Pass all the Breakpoints

When modifying a system without adequate tests, I’ve found it helps to include debugger breakpoints in my manual-testing checklist:

For each change I make, I add a breakpoint after the modified line, and I won’t remove it until the interpreter has passed through it, exercising the logic to my satisfaction. Before committing the changes, I can pull up the list of breakpoints to make sure I’ve hit them all.
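In JavaScript, for instance, the same checklist works with `debugger;` statements standing in for IDE breakpoints. This is a hypothetical sketch (the function and numbers are made up to show the workflow):

```javascript
// After changing the discount math, park a breakpoint (here a `debugger;`
// statement) right after the modified line, and don't delete it until you've
// stepped through it with the inputs you care about.
function applyDiscount(subtotal, rate) {
  const discounted = subtotal * (1 - rate); // <- the modified line
  debugger; // checklist item: stepped through with rate = 0, 0.15, and 1?
  return Math.round(discounted * 100) / 100;
}
```

With no debugger attached the statement is a no-op, but a quick grep for `debugger` before committing doubles as the “list of breakpoints” check.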

This is probably most valuable when I’ve made a lot of changes at once, or if temporary delusion is making me think I don’t need to test everything.

Caveat 1: I suppose, if there are any parallel operations, you should also test everything with the breakpoints gone, lest the code rely on losing a race condition.

Caveat 2: In no way is this a substitute for automated tests!

Freedom by vote

I’m already seeing folks in my Twitter feed assuring themselves that Ireland’s recent marriage equality referendum could never be repealed. The danger of freedom-by-majority is that public opinion is fickle, and a shift in voter turnout can have a huge effect. No doubt large numbers of Californians against Prop 8 assured themselves that it could never pass and didn’t come out to vote.

So, for Irish freedom-lovers, pat yourselves on the back, but consider the repeal efforts a serious threat.

Update: In this case it’s unlikely the demographic and cultural shift toward acceptance will swing back, and it looks like this could not be repealed by a simple vote. In fact, putting it to a popular vote might’ve been a wise move even if it had been non-binding; it got people talking and gave the public an anonymous way to voice their support.

Comedy Bang Bang podcast primer

For those only familiar with the TV show, the podcast eps are much longer, unscripted, at times not at all SFW, and frequently funnier than the show. I tried to pick densely funny episodes, but my favorites have more Andy Daly characters.

http://www.earwolf.com/episode/the-calvins-twins/
http://www.earwolf.com/episode/introducing-huell-howser/
http://www.earwolf.com/episode/this-is-not-me-this-is-them/
http://www.earwolf.com/episode/live-from-riot-la-2/
http://www.earwolf.com/episode/poehler-ice-caps/
http://www.earwolf.com/episode/enigma-force-five-reunion/
http://www.earwolf.com/episode/the-worlds-end/
http://www.earwolf.com/episode/halfway-to-china/
http://www.earwolf.com/episode/penises-abounding/

Sometimes liberty is finite and must be redistributed

The Civil Rights Act was a triumph of people willing to recognize that, to increase the liberty of black Americans, you had to reduce the freedom of whites to practice discrimination. By 1964, the principles of freedom of association (rooted in the 1st Amendment) and federalism (emphasis mine):

… had been used as weapons against black Americans, and esoteric concerns seem less important than being unable to eat or get a hotel you’re willing and able to pay for as you drive across your own country. This sort of adherence to principle at the expense of the tangible freedom of millions of African Americans sent a clear message of whose liberty received priority.

Of course the unwillingness of motels and restaurants to serve blacks was just the tip of the iceberg.

Obamacare is also redistributing liberty. Millions of Americans once shut out of the health insurance market (who could otherwise pay) are now able to gain coverage, which is a huge deal. Of course, there was no way to do that other than to force insurers to accept them, which requires the other two legs of the stool: the individual mandate and premium subsidies.

The first piece above also confronts libertarians for not seeing the world as it is, by embracing a view that racism has abated to the point where it doesn’t affect the daily lives of black Americans. Unfortunately this view is in no way limited to libertarians and couldn’t be more wrong. When people think that government interference in markets or taxation are the major things that blacks have to fear, they’re going to propose policies that are hopelessly out of touch with the real world.

Related: Ta-Nehisi Coates talks about the damage done by housing discrimination:

Elegant racism is invisible, supple, and enduring. It disguises itself in the national vocabulary, avoids epithets and didacticism. Grace is the singular marker of elegant racism. One should never underestimate the touch needed to, say, injure the voting rights of black people without ever saying their names. Elegant racism lives at the border of white shame. Elegant racism was the poll tax. Elegant racism is voter-ID laws.

…If you sought to advantage one group of Americans and disadvantage another, you could scarcely choose a more graceful method than housing discrimination. Housing determines access to transportation, green spaces, decent schools, decent food, decent jobs, and decent services. Housing affects your chances of being robbed and shot as well as your chances of being stopped and frisked. And housing discrimination is as quiet as it is deadly. It can be pursued through violence and terrorism, but it doesn’t need it. Housing discrimination is hard to detect, hard to prove, and hard to prosecute. Even today most people believe that Chicago is the work of organic sorting, as opposed to segregationist social engineering. Housing segregation is the weapon that mortally injures, but does not bruise.

Bug Fixes: the Hidden Value of JS Libraries

Paul Irish points out a huge value added by jQuery that few people (even the jQuery home page) seem to acknowledge: it transparently works around dozens of browser bugs. If you count older versions of jQuery, this is probably more like hundreds of bugs, many of them big showstoppers that even crashed browsers.

Fans of the “vanilla.js” movement—an understandable reaction to mindless library use—tend to understate what a minefield browser APIs can be once you consider the wide variety of browsers in use (take a sobering dip into your site’s browser stats). There’s nothing wrong with using and learning the real APIs, of course. Big sections of DOM, Events, and friends are very well supported, but unless you’re extremely careful, as soon as your site picks up traffic you can count on “vanilla” code breaking for users with browsers that don’t auto-update.
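As an illustrative sketch (not jQuery’s actual code) of the kind of normalization at stake, take the classic old-IE event-binding mess. The legacy property names (`attachEvent`, `srcElement`, `window.event`) are real; the helper itself is a toy:

```javascript
// Bind an event handler across old and modern browsers.
function addEvent(el, type, handler) {
  if (el.addEventListener) {
    // Modern/standard path.
    el.addEventListener(type, handler, false);
  } else if (el.attachEvent) {
    // Old IE (<= 8): different method name, "on"-prefixed type, a global
    // event object, `srcElement` instead of `target`, and `this` not bound
    // to the element. Normalize all of it before calling the handler.
    el.attachEvent('on' + type, function (e) {
      e = e || window.event;
      if (!e.target) e.target = e.srcElement;
      handler.call(el, e);
    });
  }
}
```

Multiply this by focus/blur quirks, event bubbling differences, XHR variants, and CSS measurement bugs, and you get a sense of what a library quietly absorbs for you.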

The moral: Even if you’re a JS pro and know the native APIs, if you’re doing anything substantial with them, then using jQuery—or some other battle-tested library—isn’t so mindless; you’re improving UX for some end users.

Why doesn’t someone create a library that just fixes the bugs? Dean Edwards was an early experimenter here with base2 and IE7.js, but as far as I remember these were mostly combed over by the JS nerds who were working on their own libs. Combining workarounds with other useful bits just gave you more bang for the buck, and, frankly, standard APIs like the DOM have never been particularly elegant to work with anyway; it took the W3C way too long to even acknowledge that giving devs the ability to parse markup (innerHTML) was something browsers might want to offer!

On Michael Brown and Darren Wilson

Dumping my thoughts here.

Brown and his family and community obviously got a rotten deal here. It seems very unlikely that Wilson’s behavior was completely appropriate, or that Brown would simply attack him for no reason a few days before starting school. I could imagine a scenario in which Brown rushed Wilson as a form of self-protection. That Wilson did not use a taser or some other method of de-escalation was also a very unfortunate error that he should pay for.

Cameras need to be rolling. Any officer with lethal weapons should be wearing one, and it should auto-activate whenever the officer touches a weapon or moves quickly; better to accidentally capture unneeded footage than to miss a critical incident. Funding is going to be a problem here because—it’s my impression that—high-crime areas also tend to be underfunded. We should fix that.

Wilson, and probably any officer left in such a situation (having killed an unarmed citizen with no immediate video evidence) should be arrested to show seriousness in delivering justice. Police unions will obviously fight such a policy, but hopefully this will show how failing to do so can make an officer’s life much worse and reduce the credibility of the entire profession. Wilson will be known by many as a murderer of an unarmed teen regardless of what really happened, and that’s not how justice should work.

When he goes to trial it’s hard to imagine anyone being happy with the outcome. Brown was big and tall and could clearly intimidate, and I suspect the kind of individuals who make it onto juries will very much believe a uniformed officer. A video of Wilson pacing after the shooting won’t prove wrongdoing. Brown’s family would be wise to bring a civil suit against the PD, and I’m sure they will. Lots of Ferguson’s citizens and press agencies should sue as well. Payoffs change behavior.

The St. Louis Police have, through their incompetence at crowd-handling and their arrests of press members, done the country a great service in raising public awareness of the problem of police militarization. Hopefully this will change the policies that currently help local police dress like soldiers and bring warfare tactics to U.S. streets.

It’s sad that people will use a peaceful protest as an excuse to loot local businesses and attack officers. Protesters that deny this stuff is happening lose credibility.

I’m conflicted about the wisdom of protesting in the middle of the night. On one hand, this gives cover to miscreants and increases the danger to everyone. On the other hand, it’s undoubtedly helping keep Ferguson and the issues it’s facing in the public eye. It’s hard to change policies via polite daytime picketing.

Change happens when journalists are chased, shot in the back with rubber bullets, and arrested.

Frameworks and Developer Happiness

Jake Archibald tweeted this comic expressing (I’m interpreting here):

  • There’s a difference between “using libraries” and “using frameworks”
  • Even if you don’t understand the components themselves, when using libraries:
    • there are fewer components in the system
    • the program flow through the components is clearer

I believe this is completely accurate, in that a lot of developers feel this way about frameworks, but I don’t think it’s due to a big difference between using libraries vs. a framework. I think it’s rather a matter of what kind of environment makes developers happy.

Devs prefer greater familiarity with the codebase

If you write an application “using libraries” it will always feel more comfortable. It will be crafted around your biases (your preferred configuration format and form, file layout, DI and other libraries, etc.) and it will be simple enough to meet just the use cases you foresee at the moment. Over time added features will force you to make more decisions about new components and refactoring. But no doubt you will have written a framework. Did you make objectively better decisions than those working on Framework (a public project that calls itself a “framework”), who maybe were also building on top of libraries? Maybe, maybe not, but you’ll probably feel better about those decisions, and when you look at the more complex code, you’ll remember why that code was needed and forgive the complexity.

But a new developer on this project won’t have the same biases, she’ll be overwhelmed by those complexities (which look unnecessary), and to her it will feel just like a Framework.

Code hosting sites are littered with skeleton apps built from libraries that have little or no documentation and would be difficult for a developer without the same biases to jump into. And every use case that’s had to be added has made those frameworks more complex and more impenetrable. At a certain level everything starts to look like Symfony, but frequently without the documentation and support community. An author that builds something such that the choices made were obvious may be less motivated to document it.

Devs prefer less complex systems that do a few things really well

Large organizations maintain a variety of enterprisey apps like PeopleSoft that do 1E9 things to support thousands of business processes, and I feel for the folks “toiling in the Java mines” on these systems; it looks like messy, unglamorous work, where each new feature has an impact on dozens of others. I think the sheer size of some Frameworks reminds developers of these kinds of nightmare scenarios.

Smaller projects with fewer use cases always enable a simpler framework around the business logic, and so any Framework that you’re not already very familiar with is going to seem like overkill. And it will be right up to the point where the project becomes complex or the original authors leave the team.

What to make of this

I guess my point is that, all code quality being equal and over time, there’s not a big objective quality difference between the framework you rolled from libraries and the one you downloaded that others rolled from libraries. But I recognize that it’s subjectively enjoyable to build them and to work on systems where you’re productive. And that matters.

Sorta-related aside: There’s an interesting tug-of-war dynamic between developers and management tasked with keeping a piece of software maintained. A lot of the web is geared toward hastily building something sexy and throwing it away if the product doesn’t take off, so you want devs to create and use whatever they’re most productive in. But if you’re maintaining an internal business app that will certainly be critical for the foreseeable future—and one that devs will not tolerate working on for long!—you have to optimize your dev processes for developer turnover while simultaneously trying to keep the devs happy. Introducing any technology with a potentially short lifespan introduces big risk.

Elgg’s Path Forward

Like many older PHP projects, Elgg has lots of problems with tight coupling, procedural patterns, and untestability, and it has a very web 1.9 model: spit out a full page, add a little Ajax. The good news is that Elgg has a ton of great functionality and ideas embedded in that mess, we have a core team that can often find agreement about dev principles and goals, and we have a new schedule-based release process that ensures the hard work going into the product makes it to release more quickly.

Lately I feel like the Elgg core team is excitedly gearing up for a long hike, during which we’ll make tons of hard decisions and churn lots of code remolding Elgg to look more like a modern JavaScript + PHP API framework.

I’m not sure I want to make that hike.

My suspicion is there’s a shorter route around the mountain; some modern framework may be out there whose team has already put in the hard effort of building something close to what we’re looking forward to. I think the time it would take us to get there would be long and filled with tons of wheel-rebuilding—work that won’t be going into improving UX and which provides no cross-project knowledge gain for Elgg devs.

I’m also wondering if we would be wise to ignore our itching about back end code quality for a bit and focus all attention on the front end and on UI/UX problems. As a plugin developer, I certainly see back end design choices that cause problems, but they’re rarely blockers. I spend a lot more time improving the UX and dealing with our incomplete Ajax implementation. The jewel of the 1.9 release isn’t going to be the dependency container and PSR-0 compatible autoloading; it’ll be the responsive Aalborg theme.

For me, back end refactoring work is fun because it’s relatively easy. You’re changing the way the pieces snap together, not necessarily making them work better or solving new problems. It also keeps me in the comfort zone of working mostly with code and people I’m already familiar with. This is OK for a little while but doesn’t push me to grow.

This isn’t to imply that the core team is infected with Not-invented-here. We definitely want to replace as much home-grown code as possible with well-tested alternatives maintained elsewhere. It’s just a hunch I have that this will be a long process involving tons of decisions that have already been made somewhere else.

I’m still having a lot of fun developing for and in Elgg, but I’m itching to pick up something new, and to work in a system that’s already making good use of and establishing newer practices. Hitching Elgg to another project’s wagon seems adventurous.

I also have to vent that the decision to maintain support for PHP 5.2—a branch that ended long-term support 3.5 years ago—during 1.9.x seems disastrously wrong. 1.9 had a long development process during which a significant amount of high-quality, highly-tested, and actively maintained community code was off-the-table because it wasn’t 5.2 compatible. We had to port some things to 5.2 and fix the resulting bugs, and some unit tests are a mess without Closures; just a huge waste of time. Nor could we benefit from the work being done on Drupal or WordPress because both are GPL, as are a lot of other older PHP projects with 5.2-compatible code. PHP 5.2 is still expressive enough to solve most problems without namespaces, Closures, et al., but in 2014 devs don’t want to code with hands tied behind their backs to produce less readable code that will soon have to be refactored. /rant

Get rid of variable variable syntax

Uniform Variable Syntax was voted in (almost unanimously) for PHP 7 and introduces a rare backward-compatibility break, changing the semantics of expressions like these:

                           // old meaning            // new meaning
1. $$foo['bar']['baz']     ${$foo['bar']['baz']}     ($$foo)['bar']['baz']
2. $foo->$bar['baz']       $foo->{$bar['baz']}       ($foo->$bar)['baz']
3. $foo->$bar['baz']()     $foo->{$bar['baz']}()     ($foo->$bar)['baz']()
4. Foo::$bar['baz']()      Foo::{$bar['baz']}()      (Foo::$bar)['baz']()

IMO the “variable variable” syntax $$name is a readability disaster that we should get rid of. ${$name} is much clearer about what this is (dynamic name lookup) and what it’s doing: $ is the symbol table, and {$name} tells us that we’re finding an entry in it under a key with the value of $name. PHP should deprecate $$name syntax and emit a notice in the next minor version.

It should also deprecate the syntaxes ->$name and ::$$name in favor of ->{$name} and ::${$name}. Both are bad news.

Doing so would completely eliminate the first three ambiguous expressions above, and the notices emitted would call out code that would otherwise silently change meaning in PHP 7 (one big negative of this RFC).

As for the fourth, consider these expressions:

A. $foo->prop
B. $foo->prop()
C. $foo->prop['key']()
D. Foo::$prop
E. Foo::$prop()
F. Foo::$prop['key']()

To the chagrin of JavaScript devs, PHP will not let you reference a dynamic property in an execution context, so expression B will try to call a method “prop” (and fatal if it can’t). For consistency, PHP should fatal on expression E. Currently PHP does something completely unexpected and insane: it looks up a local variable $prop.
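For contrast, a quick sketch of the JavaScript behavior those devs are used to, where a function-valued property is simply invokable, even through a dynamic name:

```javascript
// JavaScript happily invokes a function stored in a property, including via
// a dynamic key lookup; there is no separate "method table" to fall back on.
const foo = { prop: () => 'called' };
const key = 'prop';
foo.prop(); // 'called'
foo[key](); // 'called': dynamic lookup, then invocation
// PHP's $foo->prop() instead looks for a *method* named "prop",
// ignoring a Closure stored in the property of the same name.
```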

Expression C does what you’d expect, accessing the property named “prop”, so expression F should do the same, which is why I think the RFC is a clear step forward at least.

What, when, and how is your IDE telling you?

A programmers.stackexchange question basically asks if there’s an ideal frequency at which to compile your code. Surely not, as it completely depends on what kind of work you’re doing and how much focus you need at any given moment; breaking to ask for feedback is not necessarily a good idea if your plan is solid and you’re in the zone.

But a related topic is more interesting to me: What’s the ideal form of automated information flow from IDE to programmer?

IDEs can now potentially compile/lint/run static analysis on your code with every keystroke. I’m reminded of that when writing new Java code in NetBeans. “OMG the method you started writing two seconds ago doesn’t have the correct return type!!!” You don’t say. And I’ve used an IDE that dumped a huge distracting compiler message in the middle of the code due to a simple syntax error that I could’ve easily dealt with a bit later. I vaguely remember one where the feedback interfered with my typing!

So on one side of the spectrum the IDE could be needling you, dragging you out of the zone, but you do want to know about real problems at some point. Maybe the ideal notification model is progressive, starting out very subtle then becoming more obvious as time passes or it becomes clear you’re moving away from that line.
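One hypothetical shape for this (names and levels are mine, not any IDE’s API): each diagnostic starts at the subtlest presentation and only escalates as the author moves on without fixing it.

```javascript
// Sketch of a "progressive" diagnostic: severity is a ladder, and the IDE
// climbs it over time (or when the caret leaves the offending line) rather
// than shouting immediately.
class ProgressiveDiagnostic {
  constructor(levels = ['hint', 'underline', 'gutter-icon', 'popup']) {
    this.levels = levels;
    this.index = 0; // start as subtle as possible
  }
  // Called by a timer or an editor event; never climbs past the top level.
  escalate() {
    if (this.index < this.levels.length - 1) this.index++;
    return this.levels[this.index];
  }
  current() {
    return this.levels[this.index];
  }
}
```

The interesting design questions are all in when to call escalate(): elapsed time, caret distance from the problem, or evidence the author has mentally moved on.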

Anyone seen any unique work in this area?

Stepping back to the notion of when to stop and see if your program works, I think the trifecta of dependency injection, sufficiently strong typing, and solid IDE static analysis has really made a huge difference in this for me. Assuming my design makes sense, I find the bugs tend to be caught in the IDE so that programs are more solid by the time I get to the slower—and more disruptive to flow—cycle of manual testing. YMMV!