Sooner or later, a software development team has to settle on a standard for the maximum line length for source code. They might even add it to a written coding standard. The thing is that different people and teams often don't agree on how long a line of code should be. It turns out that it depends.
How long is a line of code?
80 characters per line is the traditional length because, you know… punched cards. Even though the IBM card format dates from 1928 - so not yet antique - this remains enduringly popular, becoming further set in stone in the 1978 DEC VT100 video terminal that became its own de facto standard. 80 characters per line is still going strong today, perhaps through little more than its own inertia, and perhaps programmers' ironic conservatism. Fortunately, if you're using vim, you can always set your window to 80 columns and write short lines of code.
132 characters per line is another traditional format - the other VT100 display mode, and related 132 column dot-matrix printers. 132 characters also seems to be the maximum line length in Fortran. In fact, this is probably about the longest line length in common use among programmers, although the Scala source code occasionally includes even longer lines.
120 characters per line sometimes emerges as a compromise, when longer lines seem like an outlandish decadence. Like 100 characters per line, it is a round-number compromise.
72 characters per line is the longest line length that Manning allows in their books’ code listings. This is uncomfortably short, and wrapping lines to fit this was the least fun part of writing Play for Scala.
72 characters is also the default fixed line length in Fortran, which is just an uncomfortable language.
Not too long
If you want to know how long a line of code should be, given that it depends, then you need to think about when it’s too long and when it’s too short. I have compiled a list of ways to tell that a line of code is too long:
- You can’t read a whole line of code without truncation, scrolling or automatic wrapping.
And that is all. When it comes to what is too long, it depends on how big your monitor is and how big your editor window is on that monitor. This is very much a matter of choice: you can choose to code on an 11-inch MacBook Air, work for a company that makes programmers use tiny square monitors, or fill your screen with lots of 80-column terminal windows. You don't have to do those things, but you can if you want to.
Granted, I'm ignoring the potential problem with visually scanning long lines of text, which is why newspaper columns are narrow. But I think a line of code is more like a whole paragraph of prose: you don't scan from one line to the next in the same way when you read it, and it has more internal structure than written language.
Not too short
The maximum line length can be too short as well, for various reasons.
- Code uses more vertical space, so you can see fewer lines at a time.
- Lines of code frequently require wrapping across two or more lines.
- The coding style tends towards short and abbreviated names, to avoid wrapping.
Probably the main reason for changing the maximum line length, in either direction, is to be able to view more code on the screen at once. Too long, and you can't see the right-hand side; too short, and you can't see all of the lines.
Line-wrapping can be even worse for vertical space use than just the extra wrapped lines. Sometimes, the line continuation indentation is not enough to make it visually clear where new statements start, so you have to choose between two bad options: less readable code, or losing even more vertical space to blank lines.
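To make that trade-off concrete, here is a small illustration (hypothetical Python, with invented names): the wrapped call's continuation lines sit right against the statement that follows, so without a blank line the boundary between statements is easy to miss when skimming the left margin.

```python
# Hypothetical example: a call wrapped across several lines to fit a narrow
# line-length limit.
def shipping_cost(weight_kg, rate_per_kg, handling_fee):
    return weight_kg * rate_per_kg + handling_fee

total = shipping_cost(
    12.5,
    4.0,
    3.0)
label = "invoice"  # a new statement, visually crowded against the wrapped call

# The fix is a blank line before `label = ...`, which restores the visual
# boundary between statements - but costs yet another line of vertical space.
```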
The worst consequence of short lines, however, is the tendency towards short names. The only thing worse than starting to maintain someone else's legacy code is when it was all written in small terminal windows, and is full of 1-3 character variable names. When there isn't enough screen space for longer lines, horizontally, or more lines, vertically, then you have to keep more of the code in your head. If you have space, then it is easier to read your code if you only use whole English words in it.
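As a sketch of what that costs (hypothetical Python, invented names): the two functions below compute exactly the same thing, but only one of them can be read without keeping a mental glossary in your head.

```python
# Hypothetical example: terse names, as often found in code written to fit
# narrow terminal windows. What are p, q and r?
def calc(p, q, r):
    return p * q * (1 - r)

# The same calculation with whole English words - longer lines, no guessing.
def order_total(unit_price, quantity, discount_rate):
    return unit_price * quantity * (1 - discount_rate)
```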
GitHub is the de facto coding standard
125 characters per line is the real de facto coding standard for maximum line length these days, because this is the maximum number of characters that you can see in the GitHub diff view. This used to be 119 characters, but the page layout changed.
How long is too long does not only depend on your code editor window, but on all of the tools you use to view code. For some old-school programmers, that means only the terminal window, but most programmers now use more tools than a single editor or IDE. Coding is a team sport, and now we have web-based collaboration tools.
Photo: Brad Hagan