Friday, October 30, 2015

Java Applets Don't Run In Chrome Browser

Have you noticed that some web pages have previously dynamic sections that no longer work in Chrome?

Have you seen a message stating, "Chrome no longer supports NPAPI"?

That's because the Chrome browser no longer supports Java applets... and for good reason: running Java applets in your browser is a security risk.

Options for the Die Hards

If you are determined to run that web page with that non-functional Java applet, regardless of its security implications, you have options:

Run Java Applets By

  • Using a browser that still supports NPAPI (MS IE, Safari, Firefox)
  • Using the IE Tab plugin for Chrome (Windows only)
  • Converting the Java applet to a Web Start application (if you can influence development)
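If you can influence development, a Web Start deployment replaces the applet tag with a .jnlp descriptor. A minimal sketch (the codebase URL, jar and class names are placeholders, not from any real project):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal Web Start descriptor; URLs, jar and class names are hypothetical. -->
<jnlp spec="1.0+" codebase="http://example.com/myapp" href="myapp.jnlp">
  <information>
    <title>My Former Applet</title>
    <vendor>Example Corp</vendor>
  </information>
  <resources>
    <j2se version="1.7+"/>
    <jar href="myapp.jar" main="true"/>
  </resources>
  <application-desc main-class="com.example.Main"/>
</jnlp>
```

Note that Web Start runs the application outside the browser, so the applet's init()/start() lifecycle has to be replaced with a main() entry point.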


Google's Chrome version 45 (scheduled for release in September 2015) drops support for NPAPI, impacting plugins for Silverlight, Java, Facebook Video and other similar NPAPI-based plugins.

Netscape Plugin Application Programming Interface (NPAPI) is an application programming interface (API) that allows plug-ins to be developed for web browsers.

It was first developed for Netscape browsers, starting in 1995 with Netscape Navigator 2.0, but was subsequently adopted by other browsers.

In NPAPI architecture, a plugin declares content types (e.g. "audio/mp3") it can handle. When the browser encounters a content type it cannot handle natively, it loads the appropriate plugin, sets aside space within the browser context for the plugin to render and then streams data to it. The plugin is responsible for rendering the data. The plugin runs in-place within the page, as opposed to older browsers that had to launch an external application to handle unknown content types.

NPAPI requires each plugin to implement and expose approximately 15 functions for initializing, creating, destroying and positioning plugin content. NPAPI also supports scripting, printing, full-screen plugins, windowless plugins and content streaming.

(In Mozilla's architecture, full privileges are granted by default only to "chrome" scripts, i.e., code that is part of the browser itself.)


Mozilla is deprecating all plugins.

"Plugins are now a legacy technology. They are not available on most mobile devices. Mozilla encourages website developers to avoid using plugins wherever possible. If there are plugin features which are not available in the web platform, we encourage developers to post their use cases to project list, so that Mozilla can prioritize web platform work to make those use cases possible."

Note that plugins are shared libraries that users can install to display content that the application itself can't display natively. For example, the Adobe Reader plugin lets the user open PDF files directly inside the browser, and the QuickTime and RealPlayer plugins are used to play special format videos in a web page.


If you developed a Java applet for a web page and deployed it to production, you might want to keep that fact off your resume.

Running java in a web browser was never a good idea.

The Java applet is executed within a bloated Java Virtual Machine (JVM) in a process separate from the web browser itself. The Java plugin was designed to run applets in a "secure sandbox" in the browser, which would supposedly prevent any applet from presenting security risks to your computer.

The reality is that there have been so many vulnerabilities that allow nefarious Java applet code to escape the sandbox and exploit your system that Oracle has basically given up.

Java will no longer run unsigned applets unless you go to the trouble of reducing your browser's default security settings. Running unsigned applets wouldn't be a problem if the security sandbox were trustworthy in the first place. Right?

Furthermore, the graphics generated from Java apps, IMHO, never were crisp and/or visually appealing.

Cisco’s 2014 annual security report claims that 91 percent of all web attacks in 2013 targeted Java.

Running Java applets in a browser is insecure, slow, resource-hungry and visually sub-par. So, don't do it.


This work is licensed under the Creative Commons Attribution 3.0 Unported License.

Wednesday, October 28, 2015

The Hand-Crafted SQL Anti Pattern

You probably should have used an ORM if...

  • your app has a lot of SQL statements that exceed 200 lines
  • you use multiple CTE (WITH Clauses) to deal with the complexity of your looong SQL statements
  • you frequently implement helper functions that return SQL clauses
  • you frequently refactor your helper functions that return SQL clauses
  • you find it takes more time to add functionality to your SQL queries than it took to design the database DDL, indexes and write that super long SQL in the first place

Those are a few indications of deeper issues.

Before you discount the "exceed 200 lines" or "multiple CTE" items, please consider that the reason for having SQL helper clauses is typically to reduce the complexity of the SQL statement. If the SQL statement needs simplification, then odds are it's too darn long. Hence, the "exceeds 200 lines" and "multiple CTE" remarks.

Another rule of thumb of mine... if any logic (in any language) exceeds 150 lines, it's probably too long and should be refactored.

Technical Debt

With a lot of hard coded SQL, you'll likely encounter problems down the road like...

SQL Errors you might get with Pagination

In this case, we were using a WITH clause and wanted to enforce a new sorting order...

ERROR:  SELECT DISTINCT ON expressions must match initial ORDER BY expressions
LINE 52: select distinct on (
********** Error **********

ERROR: SELECT DISTINCT ON expressions must match initial ORDER BY expressions
SQL state: 42P10
Character: 1213

The error teaches us that adding the flexibility of changing the sort order creates new, unforeseen problems to solve, which are a direct result of our legacy, hard-coded SQL.
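In PostgreSQL, the DISTINCT ON expressions must be the leftmost expressions in the ORDER BY. A minimal illustration (the orders table and its columns are hypothetical):

```sql
-- Fails with 42P10: the first ORDER BY expression is not the DISTINCT ON expression
SELECT DISTINCT ON (customer_id) customer_id, total
FROM orders
ORDER BY total DESC;

-- Works: ORDER BY leads with the DISTINCT ON expression
SELECT DISTINCT ON (customer_id) customer_id, total
FROM orders
ORDER BY customer_id, total DESC;
```

Which is exactly why bolting a new sort order onto a long, hand-crafted statement can break it.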

Reasons to use a lot of hard-coded SQL, instead of an ORM...

  • Your business objects are not easy to map to database tables
  • Your application needs lot of complex, hand-tuned SQL
  • You use a lot of stored procedures and complex logic in your database

If this is your case, you should probably revisit your database design and Data Manipulation Language (DML) implementation strategies and actively reduce your technical debt before you are asked to maintain all that hard-coded SQL.

Intelligent enterprise application design begins with information architecture and a solid database design.

If you frequently feel that it takes a lot longer to add functionality to your application than it should, you should closely examine your application architecture.


Q: Am I saying that you should always use an ORM to solve all of your DML needs?

A: No.

However, I am saying that if the vast majority of your DML code is hard-coded SQL, you should probably find a good ORM that fits in your technology stack and learn how to use it.


None. That's just my hard-knock experience speaking.


Sunday, October 25, 2015

Handling Errors in Go

Using Golang best practices for handling errors, we handle the error ASAP if it's not nil.

This avoids nested blocks of logic and is generally a good idea in any language.

Frequently, we want to return an error message in the form of a formatted string.

First Try

This technique requires both the fmt and errors packages to be imported.

   err = db.Select(&customer, "SELECT * FROM customer WHERE id = ?", id)
   if err != nil {
      err = errors.New(fmt.Sprintf("Unable to process customer: %v [Error: %+v]", id, err))
   }

Better - Use fmt.Errorf

  • Fewer imports (only requires fmt)
  • Fewer function calls
  • Less code

   err = db.Select(&customer, "SELECT * FROM customer WHERE id = ?", id)
   if err != nil {
      err = fmt.Errorf("Unable to process customer: %v [Error: %+v]", id, err)
   }

Generally speaking...

  • the fewer packages you include, the less there is to break now and in the future
  • the fewer function calls you make, the faster your code is going to run
  • the less code you have, the less code you have to review, compile and maintain

Okay, But Can We Do Even Better?

Granted, we've simplified our code. But if we use the New function from the errors package to create package-level, exported error variables, we can return those to callers, which can then compare against them.

Define Errors

Define errors in customer package:

var ErrCustomerNotFound = errors.New("customer: id not found")
var ErrTimeout = errors.New("customer: timeout")

Compare Error

response, err := processCustomer()
if err != nil {
    switch err {
    case customer.ErrCustomerNotFound:
        // Handle customer not found error.
    case customer.ErrTimeout:
        // Handle timeout error.
    default:
        // General error handler.
    }
}

Return Error Interface Type

It is idiomatic in Go to use the error interface type as the return type for any error that is going to be returned from a function or method.

This interface is used by all the functions and methods in the standard library that return errors.

For example, here is the declaration for the Get method from the http package:

func (c *Client) Get(url string) (resp *Response, err error)

Since we have defined our err return argument in our function definition, we can use a simple return command and our err value will be returned through the error interface type definition.

Even Better

Go's design encourages us to explicitly check for errors where they occur. Contrast this with other languages that encourage throwing exceptions and sometimes catching them.

However, this can lead to a lot of error handling code.

How can we reduce our code and add features, like adding an error code, that can be leveraged to support I18N requirements?

If we have one service handler which is the only place that handles processCustomer calls, we can return a service-specific error object.

type svcError struct {
    Code    int
    Error   error
}

The code can be used to lookup the appropriate, localized message and the original error can be passed along to the handler.

func processCustomer(req *svc.Request, resp svc.Response) *svcError {
    if _, err := req.Get("id"); err != nil {
        return &svcError{CustomerIdNotFound, err}
    }
    return nil
}

It helps to define error code constants.

const CustomerIdNotFound = 10001


  • Allow svcError to take a slice of parameters that can be used to construct a more complicated error message
  • Include a StackTrace
  • Handle panics in svcError and report a more user friendly message to the consumer



Sunday, October 11, 2015

The NoSQL Big Data Face Palm

I have found in practice that the use of NoSQL databases to solve Big Data problems largely exists where there was an architect who (pick one or more):
  • Thought a NoSQL technology could provide reliable, believable and accessible data that everyone in the corporation can rely on
  • Wanted to try a new technology and did not foresee the technical debt that would accrue over time where strong schema enforcement is not baked into the database solution
  • Was more interested in a quick win with a promising technology than a sustainable, long term architecture that required significant database design and forethought
  • Failed to understand when to use OLTP and when to use OLAP and how data should be modeled and how it should flow from a live system to a reporting/analytics system
  • Did not understand how to use Database Sharding to achieve high performance with distributed large data stores
  • Thought designing for ease of programming at the cost of sparse data storage was a best practice.
  • Thought, "Disk storage is cheap.", but failed to take IO into account.
  • Thought, "We have so much data, we won't consider backup or disaster recovery".
  • Failed to account for the effects of technology churn of their new technology
  • Does this a lot:  


I am not saying that technologies like Cassandra, Riak, Hadoop, MongoDB, etc., should have no place in any corporate portfolio.

(There are many use cases where the capability of storing unpredictable and/or amorphous data is a necessity, but often times there will be a relational database that contains the metadata to make sense of the noSQL data.)

I am saying that implementations and deployments of those technologies have caused a lot of data integrity issues and should be thoughtfully considered before adoption.



Tuesday, October 6, 2015

El Capitan, Homebrew and Java

It's a good idea to upgrade to El Capitan.

As usual, with major updates things break.

In this post we'll look at how to get homebrew and Java functioning properly.


The first thing you may notice is that any application that requires Java, e.g., IntelliJ, no longer works.

Let's fix that by updating Java for El Capitan.

Go Here and click the Download button

Java for OS X 2015-001

After the javaforosx.dmg file is downloaded into your ~/Downloads folder, just double click it from the Finder and follow all the prompts to install it.

Update Java

Next, you'll want to go to your system preferences and update Java:

Now, you can verify it and see where El Capitan puts Java:


If you're like me and use jenv, you'll need to update that, too.

Note: jEnv is a command line tool to help you forget how to set the JAVA_HOME environment variable.

brew info jenv - Error

If you get an error like the following when attempting to run brew commands...

$ brew info jenv
jenv: stable 0.4.3 (bottled), HEAD
Manage your Java environment
/usr/local/Cellar/jenv/20130917 (62 files, 260K) *
  Built from source
==> Caveats
To enable shims and autocompletion add to your profile:
  if which jenv > /dev/null; then eval "$(jenv init -)"; fi

To use Homebrew's directories rather than ~/.jenv add to your profile:
  export JENV_ROOT=/usr/local/opt/jenv
~ $ brew update
error: unable to unlink old '.travis.yml' (Permission denied)
Error: Failure while executing: git pull --ff --no-rebase --quiet origin refs/heads/master:refs/remotes/origin/master

Fix Permissions

... then you should fix the permissions in the /usr/local directory:

sudo chown $(whoami):admin /usr/local && sudo chown -R $(whoami):admin /usr/local

Note that the previous command will fix a good number of other homebrew issues.

Update jenv shim

You'll want to update line 21 in $HOME/.jenv/shims/java

For me, I had to change this...

exec "/usr/local/Cellar/jenv/20130917/libexec/libexec/jenv" exec "$program" "$@"

... to this:

exec "/usr/local/Cellar/jenv/0.4.3/libexec/libexec/jenv" exec "$program" "$@"

Set Global jEnv Java Version

If you see this...

$ jenv versions
jenv: version `oracle64-' is not installed

... then, run this:

$ jenv global oracle64-

$ jenv versions
* oracle64- (set by /Users/lex/.jenv/version)

Check Java Version

Finally, you can verify that Java is now happy:

$ java -version
java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)


Homebrew needs some love, too.

Just keep running brew update and doing what it says until you see this:

$ brew update
Already up-to-date.

Then, run brew doctor to be sure you're all good with brew:

$ brew doctor
Your system is ready to brew.


Java for OS X 2015-001 jEnv


Wednesday, September 23, 2015

Mac iCloud and Spontaneous System Reboots

The other day when rebooting from a system update I was presented with the option of using an iCloud account, instead of my normal system login account.

I figured, why not? What's the worst that can happen?

What did happen was unpredictable, spontaneous system reboots (over six in one day).

I studied the syslog messages for what could be happening (and there were a lot to scan through).

There were a lot of messages pertaining to iCloud resources and authentication.

I reverted to my normal system login account and disabled the iCloud hooks that I could find.

It's been over 60 hours and I have experienced 0 spontaneous system reboots since disabling iCloud on my Mac.

See screenshots below:

Do NOT choose iCloud

Uncheck Anything iCloud Related

Uncheck Anything iCloud Related


  • I do not appreciate Apple's attempt to force me to use their paid iCloud services.
  • If Apple continues down this road, I will dump my Mac OSX systems and develop solely on Linux-based systems
  • I don't think I'm alone in this. I am a developer. I paid good money for my Mac system and use it for development and don't have time for these marketing hassles.


Monday, September 14, 2015

Regex - Contains THIS String But Not THAT String

Here's a regular expression that will return lines that have THIS string but not THAT string:



This could be useful if you are searching for tables that do not have the id attribute:
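A generic way to express "contains THIS but not THAT" is a negative lookahead anchored at the start of the line (THIS and THAT are placeholders for your actual strings):

```javascript
// Negative lookahead: fail the whole match if "THAT" appears anywhere,
// otherwise require "THIS" somewhere on the line.
var re = /^(?!.*THAT).*THIS.*$/;

console.log(re.test("THIS line matches"));        // true
console.log(re.test("THIS line also has THAT"));  // false
console.log(re.test("nothing relevant"));         // false
```

The same pattern works in any PCRE-compatible search tool (e.g., grep -P).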



Friday, August 28, 2015

Difference between Map and Filter


  • You can prettify your JSON output by passing null and the number of spaces to indent to JSON.stringify.
  • Another useful higher-order function you can use is reduce, which combines the elements of an array into a single value.
  • The logic that each function (map, filter, reduce) calls to manipulate the original array of data is called a closure. Note that there is no formal function declaration and no function name for a closure.
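A quick sketch of map, filter and reduce, plus the prettified JSON output mentioned above (the sample data is hypothetical):

```javascript
var nums = [1, 2, 3, 4];

// map: transform every element (same number of elements out as in)
var doubled = nums.map(function (n) { return n * 2; });            // [2, 4, 6, 8]

// filter: keep only the elements the closure approves (possibly fewer)
var evens = nums.filter(function (n) { return n % 2 === 0; });     // [2, 4]

// reduce: fold the whole array down to a single value
var sum = nums.reduce(function (acc, n) { return acc + n; }, 0);   // 10

// Prettified JSON output: pass null and an indent width to JSON.stringify
console.log(JSON.stringify({ doubled: doubled, evens: evens, sum: sum }, null, 2));
```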


Wednesday, August 12, 2015

Revert Commit into a Branch

Suppose you accidentally push file changes in the form of a git commit to your master branch, but you intended to first create a branch of those changes and push the branch.

Here's how to fix that....

git reset HEAD~1 --hard                                        # move the head back before your commit
git push -f                                                    # make git accept it
git fetch                                                      # get latest 
git reset origin/master                                        # undo non-pushed commit
git checkout -b NEW_BRANCH_NAME                                # create new branch for file changes
git stash save 'moving committed changes to NEW_BRANCH_NAME'   # workstation looks like it did before you made changes
git cherry-pick COMMIT_HASH                                    # grab your changes via cherry-pick
git push -u origin NEW_BRANCH_NAME                             # push NEW_BRANCH_NAME with changes to remote repo

You will probably want to make note of the git hash created by the original commit BEFORE running these commands. You can run $ git log to get that hash value.

This assumes you want to push your changes into a NEW_BRANCH_NAME, which will likely get merged into the master branch after a pull request and code review.

Assuming this works, you should delete the stash you made using $ git stash drop


Is it possible to retroactively turn a set of commits into a branch?
Squash my last X commits together using Git


Friday, July 17, 2015

Regex for (not

I had a failing test that indicated that I had some sample code in my source.

So, I needed a nifty regex to find it.

Failing Test

A test failed with the following message:

Authenticate User and get his Role
cookies: [ 'name=undefined;; Path=/admin; Secure',
  'session=7bi4j6U2VYleeAC_kLseiA.IBKutIFH6iuag66-hrWwyVf175J6NaJICEkgMLC1gl9OrWpvpNTpv-SQ3QJCJe_VfB4MBwjIkpwNgwEM8R99qp6qNm0CXYqbdjaq6_R7PB-O2Vm-cFavjZEohzkNVVnYjlu3BDWFU17y4ENZaMNADXiZ150Pf_nGvdoVZmNFiZh2ysiIk0eRmSOiLEJtkWyj84btuBew0ylUKLn0ywRlnFBllKm4X8_GrCaWxRCFG6iS9T76r_X9PDb9BQKC6eZB2hQRKsykidJ3OY-G5PC_GJwS_LGlgYwP25-0BP8V1524LCvEZ3w5qZBX2kmCxrpwVA4ycls1F4fz3XSDLCyOnxO9rYpP2JYwjfYhkgV71-JNBogeVdVDL_JqWDTjZJZLDPrOp4ZmESrh6kI3n_f6zuxaWvuYK-c31_icvWm_g1eXmpKo4CVB0-Vv6EYuDuh5tS9y4yybr9mMrnZaSwHpyTp6YBsd9i0H6cyHZ7YiyIQCVa30We0iTa335arKG2zG.1437151277357.3600000.UePJmiwfcRyceI0eQ32BbXyekHbzvQ1tk5tP2_8O7T8; path=/; expires=Fri, 17 Jul 2015 17:41:18 GMT; httponly' ]

Searching for ""

Searching for "" yielded too many results because there were a lot of "" strings found.

Magic Regex

This regular expression did the trick:


It says, "Give me everything that matches "" but does not also contain "www"."
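Using a placeholder domain (example.com stands in for the actual site), the pattern has the same negative-lookahead shape:

```javascript
// Match lines mentioning example.com, but skip any line that also contains "www".
var re = /^(?!.*www).*example\.com.*$/;

console.log(re.test("http://example.com/admin"));      // true
console.log(re.test("http://www.example.com/admin"));  // false
```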


Wednesday, July 15, 2015

The Philosophy of Success at Work


Here are some related quotes from some notable people:

“To be is to do.” — Socrates
“To do is to be.” — Jean-Paul Sartre

“The way to be is to do.” — Dale Carnegie
“The way to do is to be.” — Lao-tzu, Chinese philosopher

“We act as though comfort and luxury were the chief requirements of life. All that we need to make us happy is something to be enthusiastic about.” —  Albert Einstein

“Success consists of going from failure to failure without loss of enthusiasm.” — Winston Churchill

“Nothing great was ever achieved without enthusiasm.” — Ralph Waldo Emerson

I think they are all correct.


You need to be in at least two of the circles above to stay employed.

You need to be in all three of the circles to thrive at work.

At the heart of success in your endeavors is your enthusiasm.

So, find what you truly enjoy doing, deliver good work and be nice and you will be successful at it.


Wednesday, July 8, 2015

Upgrade Node.js to Avoid DoS Attack


If you are running Node.js v0.11.0 to v0.12.5, then you need to upgrade to v0.12.6 ASAP.

A crashed process, and therefore a denial of service, is typically what happens with buffer exploits like this one.

The Exploit

A bug in the way the V8 engine decodes UTF strings has been discovered. This impacts Node at the Buffer to UTF8 String conversion and can cause a process to crash. The security concern comes from the fact that a lot of data from outside of an application is delivered to Node via this mechanism which means that users can potentially deliver specially crafted input data that can cause an application to crash when it goes through this path. We know that most networking and filesystem operations are impacted as would be many user-land uses of Buffer to UTF8 String conversion.


Here's some background information on how buffers work in NodeJS.

Buffers are instances of the Buffer class in node, which is designed to handle raw binary data. Each buffer corresponds to some raw memory allocated outside V8. Buffers act somewhat like arrays of integers, but aren’t resizable and have a whole bunch of methods specifically for binary data. In addition, the “integers” in a buffer each represent a byte and so are limited to values from 0 to 255 (2^8 – 1), inclusive.

There are a few ways to create new buffers:

var buffer = new Buffer(8);

This buffer is uninitialized and contains 8 bytes.

var buffer = new Buffer([ 8, 6, 7, 5, 3, 0, 9]);

This initializes the buffer to the contents of this array. Keep in mind that the contents of the array are integers representing bytes.

var buffer = new Buffer("I'm a string!", "utf-8")

This initializes the buffer to a binary encoding of the string, using the encoding given by the second argument (utf-8 here).

Writing to Buffers

Given that there is already a buffer created:

var buffer = new Buffer(16);

We can start writing strings to it:

buffer.write("Hello", "utf-8")

The first argument to buffer.write is the string to write to the buffer, and the second argument is the string encoding. It happens to default to utf-8 so this argument is extraneous.

buffer.write returned 5. This means that we wrote to five bytes of the buffer. The fact that the string “Hello” is also 5 characters long is coincidental, since each character just happened to be 8 bits apiece. This is useful if you want to complete the message:

buffer.write(" world!", 5, "utf-8")

When buffer.write has 3 arguments, the second argument indicates an offset, or the index of the buffer to start writing at.

Reading from Buffers

Probably the most common way to read buffers is to use the toString method, since many buffers contain text:

buffer.toString("utf-8")
'Hello world!\u0000�kt'

Again, the first argument is the encoding. In this case, it can be seen that not the entire buffer was used! Luckily, because we know how many bytes we’ve written to the buffer, we can simply add more arguments to “stringify” the slice that’s actually interesting:

buffer.toString("utf-8", 0, 12)
'Hello world!'

Using Buffers in the Browser

The Buffer exploit mainly affects backend servers running NodeJS (or old versions of IO.JS), but the use of Buffers is not limited to the backend.

You can also work with buffers in the browser by using:

However, its performance is poor, mainly due to Buffer design decisions.

Equivalent functionality, with better performance, is provided in the browser by TypedArrays or bops.


bops presents a JavaScript API for working with binary data that will work exactly the same in supported browsers and in node. Due to the way that Buffer is implemented in node, it is impossible to take code written against the Buffer API and make it work on top of binary data structures (Array Buffers and Typed Arrays) in the browser.

Instead, you have to fake the API on top of Object, but Object isn't designed for holding raw binary data and will be really slow/memory inefficient for many common binary use cases (parsing files, writing files, etc).

Upgrade NodeJS

If your target operating system is OS X, then you probably have these main packages to consider:
  • NodeJS
  • NPM
... and probably these as well:
  • Homebrew
  • NVM

If you're a Homebrew user and you installed node via Homebrew, there is a major philosophical issue with the way Homebrew and NPM work together, stemming from the fact that both are package management solutions.

There are many ways to install these packages.

Read this article for my suggested solution (that does not require you to use sudo permissions): Cleanly Install NVM, NodeJS and NPM.



Tuesday, May 19, 2015

Microsoft IE8 End Of Life

Microsoft recommends customers plan to migrate to a supported operating system and browser combination by January 12, 2016.

IE8 Issues

HTML5 CSS3 Incompatibilities

Remember having to insert the following into your head tag to help fix the lack of support IE8 has for html5 tags and CSS3 properties? (or using Modernizr)

<!--[if lt IE 9]>
      <script src=""></script>
      <script src=""></script>
<![endif]-->

Missing functions

Remember having to use es5-shim because IE8 did not implement lastIndexOf, map, filter, every, forEach, etc. functions?
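es5-shim patches functions like these onto Array.prototype when the browser lacks them. A simplified sketch of what such a shim does for filter (filterShim is an illustrative name; the real shim installs itself on Array.prototype and handles more edge cases):

```javascript
// Simplified stand-in for the kind of filter polyfill es5-shim provides.
var filterShim = function (arr, callback, thisArg) {
  var result = [];
  for (var i = 0; i < arr.length; i++) {
    // 'i in arr' skips holes in sparse arrays, as the spec requires
    if (i in arr && callback.call(thisArg, arr[i], i, arr)) {
      result.push(arr[i]);
    }
  }
  return result;
};

var odds = filterShim([1, 2, 3, 4, 5], function (n) { return n % 2 === 1; });
// odds is [1, 3, 5]
```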

Security Vulnerabilities

Did you know that IE8 has over 500 known security vulnerabilities? (that will never get fixed)

IE8's continued reliance on ActiveX makes it vulnerable to the core.

Unforgiving Parser

Back in the days of IE5, IE was very forgiving when it came to HTML syntax.

IE8 is unforgiving in regards to HTML syntax and javascript.

I'm not saying that I approve of a lax enforcement of standards, but I do recall how quickly a web developer could crank out a web application when the user base was all IE users. Not so with IE8.

In many cases, IE8 would make your site break, even if it were coded perfectly.

Good bye, IE8. (and good riddance!)

p.s. Unbelievably, IE8 was actually somewhat better than IE7.



Wednesday, May 6, 2015

We're doing emergency maintenance to recover the site

How many companies depend on github?

What is your mitigation strategy when things go wrong?

Today, github had a major outage.

Granted, github was only down for under 30 minutes, but that can still wreak havoc for scripts that depend on github and don't have 30-minute+ retries built in.

Between approx. 7:40 a.m. and 7:54 a.m. EST, if you were to try to reach any resource within the github.com url, this is what you'd see:

GitHub Status



Friday, April 17, 2015

Where all Good Software Goes to Die


My local NodeJS build started breaking. It caused me heartache (and is no doubt affecting others). I learned that it was because some employees at Joyent--the company that took ownership of NodeJS--ticked off core NodeJS software engineers. They left to form a new, truly open source software package called IOJS (a fork of NodeJS that is now far better than NodeJS).

Where does all good software go to die?

Corporations that put politics, political correctness, and profits ahead of creating great software.

Here are a few examples:
  • Oracle - MySQL
  • Oracle - OpenOffice
  • Joyent - NodeJS

There are no doubt many more, but these are the ones that percolate to the top of my mind.

Decline in Interest in MySQL

Interest peaked before Oracle acquired MySQL:

MySQL was left to the roadside, usually, since it was considered a useless appendage that prevents people from using the Oracle DB software. There were several community blunders (not making source public, not accepting patches, long-standing bugs with existing patches, etc) that forced MySQL guys to move to MariaDB. There was a big renaissance after the move, with many new features added and many bugs fixed. Sort of like the party when the house drops on the witch in the Wizard of Oz.
~ reddit

LibreOffice Forked from OpenOffice

Interest peaked before Oracle acquired OpenOffice:

OpenOffice. Oracle botched this so hard. No patches accepted, no timelines, no community communication. Oracle only paid attention to Fortune 500 contributors. Eventually, OpenOffice heads formed a foundation to start correcting some of these compounded issues. Oracle responded by kicking the members out of the project, telling them they couldn't use the OpenOffice trademark, etc. So all the experts left and formed LibreOffice. Another renaissance was had, and many long-standing issues were fixed. Code was maintained. The LibreOffice guys now regularly publish updates, statistics, reports, etc. It's a great example of how a professional FOSS project should be.
~ reddit

Number of NodeJS Releases

The following chart shows the number of stable NodeJS releases, per year:

The rest of this article will focus on NodeJS.


The number of stable releases of a software package is a good indication of its health.

It's clear that NodeJS should be in the ER. STAT.

As with other open source projects, a decline in the number of stable releases immediately precedes a major decline in public interest in it.

Common Thread

Mostly, poor management decisions caused the best software development talent to leave the project, which directly led to the decline in that software's quality, interest and significance in the industry.

Joyent Calls Prolific NodeJS Contributor an "Asshole"

I'm not making this up. Seriously, I'm not.

Read it for yourself here.

First, it would help to understand how software development works using the Git Workflow that NodeJS was using.

I explained the pertinent part of it in this snippet from this post.
  • Developer creates feature branch, commits file changes and then submits a Pull Request
  • Other developers are notified of Pull Request, perform code review and the last one merges the feature branch to master

  Here's what happened:
  1. A code reviewer noticed that Ben Noordhuis wrote the pronoun, "him", a few times instead of "him/her" or "them", in an inline comment that described part of the NodeJS logic and submitted a Pull Request (PR) to change the pronouns. See patch here.
  2. Ben rejected the PR, providing this comment: "Sorry, not interested in trivial changes like that."
  3. A shit storm of bullying comments ensued from, "...always assumed to be male first on the internet. I'm +1 on this documentation change." to the more direct, "Stop pissing around and merge the damn PR."
  4. Some other contributor undeleted the trivial pronoun-changing PR and force pushed it.
  5. A stream of praise was given to that committer that pushed the PR to replace "him" with "them", e.g., "I believe these kinds of things do make a difference. Same for speakers, presenters, organisers etc. at events making an effort to e.g. switch between gendered pronouns (because yes, for many this is indeed still an effort, and probably even more so for non-native speakers of English, who are often not so aware of the finer points of the language or "accepted" alternative ways to express things). I'm always happy when someone does this!"
  6. Ben left the NodeJS community to help form IO.JS (an improved version of NodeJS) and is back to being highly productive.
  7. Joyent calls Ben an "asshole".


Cyberbullying is a general term for the harassment of someone through electronic media, usually but not always social media.

The Thread That Took Down NodeJS

Whoever said NodeJS was fault tolerant was wrong.

Can you find any technical merit in any of the above (un-edited) comments?

Does that sort of dialog belong in a source code repository?

How does any of that help Joyent sell more NodeJS services?

There is a clear lack of vision from the technical management team at Joyent.

Here it is in its entirety:

Joyent's behavior (lack of leadership/poor management practices) has replaced NodeJS core contributors with individuals that are obviously more interested in the proper use of pronouns in comments than in improving what matters: the NodeJS software.

Is NodeJS doomed to the same fate as other similar, significant open source software products?

Joyent and the Future of NodeJS

Currently, it's not clear what will become of NodeJS.

NodeJS' corporate owner, Joyent, is apparently still in awe of its "progressive views" as it continues to publish an article that calls one of NodeJS' most talented contributors an "asshole".

Technically minded, merit-based software engineers are going to have a hard time getting behind a company that pushes its social agenda ahead of software development.

The vast majority of NodeJS' core developers left the NodeJS community to form IO.JS

Joyent is making the appearance of mending fences, but time will tell ...

You can read the active dialog in the Reconciliation Proposal thread.

But the task force may find it difficult to reconcile with reasoning like the following from the IOJS community:

i'd rather not reconcile. the benefits are not substantial, and i'm very happy with how iojs has been run. i don't want iojs to change organizationally in the name of reconciliation. for me, iojs' organization is an ultimatum. i don't really care about naming and recognition. i'd rather just start pushing #!/usr/bin/env iojs and iojs-only (specifically, ES6+) support everywhere.

MIT License

There is a big difference. Node is MIT. And other companies with power and interest in node could simply fork if Joyent were to act foolishly. ~ Tim Caswell

For details, see Joyent & Node


One of the major components in most of my current front-end application architectures is JSDOM.

JSDOM is the first component in my stack to formally declare a divergence from NodeJS.

Here's what it says at the top of their README:

Note that as of our 4.0.0 release, jsdom no longer works with Node.js™, and instead requires io.js. You are still welcome to install a release in the 3.x series if you use Node.js™.

Available ES6 features in IO.JS

The following list of features are available without using any flags:

  • Block scoping (let, const)
  • Collections (Map, WeakMap, Set, WeakSet)
  • Generators
  • Binary and Octal literals
  • Promises
  • New String methods
  • Symbols
  • Template strings

Those ES6 features are very important. I'm sure I'll blog more about them in the future.
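
As a quick taste, here is a sketch exercising a few of the flag-free features listed above (runnable on io.js, or on later Node releases that absorbed them):

```javascript
// Block scoping and Collections
let counts = new Map();
counts.set('hits', 0b1010);      // Binary literal (decimal 10)

// Generators: lazily produce values
function* firstN(n) {
  for (let i = 1; i <= n; i++) yield i;
}
let total = 0;
for (let v of firstN(3)) total += v;   // 1 + 2 + 3

// Template strings and new String methods
const label = `total=${total}`;
console.log(label.startsWith('total='));  // true

// Promises
Promise.resolve(counts.get('hits')).then(function (hits) {
  console.log(`hits=${hits}`);            // hits=10
});
```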

When the other NodeJS dependencies (besides JSDOM) support IO.JS, I'll jump ship.

Personal Value Driven Decisions

I chose to move from MySQL to PostgreSQL

I chose to drop OpenOffice in favor of LibreOffice

I will very likely drop NodeJS in favor of IO.JS  

The NodeJS Debacle - Lessons Learned for CTOs

Here are some suggestions that, if implemented, could help balance the demands of software development, socially aware employees and profitability:
  • Retain your real talent
  • Streamline software development governance
  • Do not allow the political correctness dept, finance dept., etc. to make decisions that impact software quality
  • Keep politics and social agendas out of your Git Workflow
  • Create an internal social media site for your writers and comment editors to discuss their non-technical beliefs
  • Hire a management team that can reconcile all time spent (X) to the question, "Does X help us sell more product?"

When you perceive that one of your software vendors is offering more and more discounts (frequently in the form of a recurring revenue scheme), you should look into the technical viability of the product being pushed.

If there is another open source alternative that has sprung out of the discontent of the lead developers of the software you're considering (or have been sold), beware.

Personal Opinion

I think that Joyent will continue to take the lead in gender-neutral-pronoun political correctness.

NodeJS may have the most politically correct, properly conjugated documentation, but that's not important to me.

What is important is being able to rely on the current and future stability of my software platform.

I bet that the interest curve and hence the viability of NodeJS is going to take a much steeper dive into insignificance than either MySQL or OpenOffice.

I personally applaud Ben for his professionalism and technical and social contributions.

I see the IOJS / NodeJS situation as a David / Goliath story.

It's only a matter of time before Joyent and its flagship product, NodeJS, fall to the feet of IOJS.

Reality and Drama

Money talks.

So, here's what I think is going to happen:
  1. Joyent financial backers will soon understand why their cash cow is dying
  2. Joyent's management team will be replaced
  3. The real talent(s) behind NodeJS (doing IOJS development) will be offered a deal and we may soon see an announcement like, "Joyent's core business (cloud computing) aligns well with a free and open IO.js."
Will the IOJS talent(s) take the hush money and be good boys or retain their dignity?


Share this article

This work is licensed under the Creative Commons Attribution 3.0 Unported License.

Automatic Semicolon Insertion in Javascript

You can write (mostly) semi-colon-less Javascript code. See example below.

However, there is a significant performance impact for doing so. See example below.
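
A minimal sketch of the first point, plus the best-known ASI correctness trap (the performance cost itself only shows up under a profiler):

```javascript
// Semicolon-less code: ASI inserts the statement terminators at line breaks
var x = 1
var y = 2
var sum = x + y

// The classic trap: ASI also inserts a semicolon right after `return`,
// so the object literal below becomes dead code and the function
// returns undefined instead of { debug: true }
function config() {
  return
  {
    debug: true
  }
}

console.log(sum)        // 3
console.log(config())   // undefined
```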

ASI performance is abysmal

A bug was created on 2013-01-24 and assigned to nobody. See Bug 107901: ASI performance is abysmal


IMHO: Perhaps one day ASI performance will become a priority and the hit won't be significant, but until then keep using semicolons.

Share this article

This work is licensed under the Creative Commons Attribution 3.0 Unported License.

Wednesday, April 15, 2015

Cleanly Install NVM, NODE and NPM

Today, while I was installing a debugger for Node, the install failed.

That made me take a long look at how I installed node, npm and nvm.

Here's what I decided:

Node comes with its own package manager. It's called npm.

Brew is an awesome package manager for software you install on a Mac, but it should not be used to install node.

I had a bunch of cruft left over from previous installs of node, npm and nvm, some of which was getting in the way of cleanly installing them.

So, I wrote a bash script that does the following:
  • Remove previous installation cruft
  • Use brew to install nvm
  • Use nvm to install node (which also installs npm)

Here it is:


#!/bin/bash
# filename:  install-nvm-npm-node
# author:    Lex Sheehan
# purpose:   To cleanly install NVM, NODE and NPM
# dependencies:  brew

CR=$'\n'                          # newline for the echo'd messages
REV=$(tput rev)
OFF=$(tput sgr0)
MY_NAME=$(basename "$0")
NODE_VER_TO_INSTALL=$1
# Old caches get moved here (the location is an arbitrary choice)
BACKUP_DIR="$HOME/backups/nvm-$(date +%Y%m%d%H%M%S)"

if [ "$NODE_VER_TO_INSTALL" == "" ] || [ "$(echo "$NODE_VER_TO_INSTALL" | cut -c1-1)" != "v" ]; then
    echo "${CR}Usage:   $ $MY_NAME <node-version>"
    echo "Example: $ $MY_NAME v0.12.1$CR"
    echo "First, run:  $ brew update"
    echo "Likely, you'll need to do what it suggests.$CR"
    echo "To install the latest node version, run the following command to get the latest version:  $ nvm ls-remote"
    echo "... and pass the version number you want as the only param to $MY_NAME.$CR"
    exit 1
fi

echo "Are you ready to install the latest version of nvm and npm and node version $NODE_VER_TO_INSTALL ?$CR"
echo "Press CTRL+C to exit --or-- Enter to continue..."
read x

echo "${REV}Uninstalling nvm...$CR$OFF"
# Making backups, but in all likelihood you'll just reinstall them (and won't need these backups)
if [ ! -d "$BACKUP_DIR" ]; then
    echo "Creating directory to store $HOME/.nvm, .npm and .bower cache backups: $BACKUP_DIR"
    mkdir -p "$BACKUP_DIR"
fi
set -x
mv $HOME/.nvm   "$BACKUP_DIR"  2>/dev/null
mv $HOME/.npm   "$BACKUP_DIR"  2>/dev/null
mv $HOME/.bower "$BACKUP_DIR"  2>/dev/null
{ set +x; } &>/dev/null

echo "$REV$CR""Uninstalling node...$CR$OFF"
echo "Enter your password to remove some node-related /usr/local directories"
set -x
sudo rm -rf /usr/local/lib/node_modules
sudo rm -rf /usr/local/lib/node
sudo rm -rf /usr/local/include/node
sudo rm -rf /usr/local/include/node_modules
sudo rm -f  /usr/local/bin/npm
sudo rm -f  /usr/local/lib/dtrace/node.d
rm -rf $HOME/.node
rm -rf $HOME/.node-gyp
sudo rm -f  /opt/local/bin/node
sudo rm -rf /opt/local/include/node
sudo rm -rf /opt/local/lib/node_modules
rm -rf /usr/local/Cellar/nvm
brew uninstall node 2>/dev/null
{ set +x; } &>/dev/null

echo "$REV$CR""Installing nvm...$CR$OFF"
echo "++ brew install nvm"
brew install nvm
source "$(brew --prefix nvm)/nvm.sh"

echo "$REV$CR""Insert the following line in your startup script (ex: $HOME/.bashrc):$CR$OFF"
echo "export NVM_DIR=\"\$(brew --prefix nvm)\"; [ -s \"\$NVM_DIR/nvm.sh\" ] && . \"\$NVM_DIR/nvm.sh\"$CR"
NVM_DIR="$(brew --prefix nvm)"

echo "${CR}Using nvm install node...$CR"
echo "++ nvm install $NODE_VER_TO_INSTALL"
nvm install $NODE_VER_TO_INSTALL
NODE_BINARY_PATH="$(find /usr/local/Cellar/nvm -name node -type d | head -n 1)/$NODE_VER_TO_INSTALL/bin"
echo "$REV$CR""Insert the following line in your startup script (ex: $HOME/.bashrc) and then restart your shell:$CR$OFF"
echo "export PATH=\$PATH:$NODE_BINARY_PATH:$HOME/.node/bin"

echo "${CR}Upgrading npm...$CR"
echo '++ npm install -g npm@latest'
npm install -g npm@latest
echo "$REV$CR""Insert the following line in your $HOME/.npmrc file:$OFF"
echo "${CR}prefix=$HOME/.node$CR"
echo "Now, all is likely well if you can run the following without errors:  npm install -g grunt-cli$CR"
echo "Other recommended global installs: bower, gulp, yo, node-inspector$CR"

Gist for install-nvm-npm-node (that I might update)

Share this article

This work is licensed under the Creative Commons Attribution 3.0 Unported License.

Thursday, March 19, 2015

Simple fix for Module did not self-register

If you see the following error ...

Module did not self-register Error

[18:28:09] Error in plugin 'gulp-mocha'
    Module did not self-register.
Error: Module did not self-register.
    at Error (native)
    at Module.load (module.js:355:32)
    at Function.Module._load (module.js:310:12)
    at Module.require (module.js:365:17)
    at require (module.js:384:17)
    at bindings (/Users/lex/myproject/node_modules/jsdom/node_modules/contextify/node_modules/bindings/bindings.js:76:44)
    at Object. (/Users/lex/myproject/node_modules/jsdom/node_modules/contextify/lib/contextify.js:1:96)
    at Module._compile (module.js:460:26)
    at Module._extensions..js (module.js:478:10)
    at Object.require.extensions.(anonymous function) [as .js] (/Users/lex/myproject/node_modules/6to5/lib/6to5/api/register/node.js:113:7)
[18:28:10] Finished 'scripts' after 21 s

Simple fix

... simply remove the node_modules directory and reinstall your npm modules.

/Users/lex/myproject/ $ rm -rf node_modules/
/Users/lex/myproject/ $ npm install


Share this article

This work is licensed under the Creative Commons Attribution 3.0 Unported License.