
So what’s wrong with 1975 programming?

Notes from the Architect uses Varnish as its application example to highlight how things have changed since then. You can extrapolate from his examples to, say, Python or Java versus BASIC or C.

I have spent many years working on the FreeBSD kernel, and only rarely did I venture into userland programming, but when I had occasion to do so, I invariably found that people programmed like it was still 1975.

So what’s wrong with 1975 programming? The really short answer is that computers do not have two kinds of storage any more.

And people program this way.

They have variables in “memory” and move data to and from “disk”.

Take Squid for instance, a 1975 program if I ever saw one: You tell it how much RAM it can use and how much disk it can use. It will then spend inordinate amounts of time keeping track of what HTTP objects are in RAM and which are on disk and it will move them forth and back depending on traffic patterns.

Well, today computers really only have one kind of storage, and it is usually some sort of disk, the operating system and the virtual memory management hardware has converted the RAM to a cache for the disk storage.

In other words, a critical part of 1975 programming was about memory management and temporary data storage on the machine. In modern applications programming, this is best left to the operating system. This is why garbage collection is such a big deal in modern programming languages. Memory management inside the application is a major source of bugs and can provide an access point for malicious code injection.
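
As a minimal sketch of that “let the OS do it” approach, the Python snippet below memory-maps a file and reads from it as if it were an ordinary in-memory buffer, leaving it to the kernel’s virtual memory system to decide which pages actually sit in RAM. The file name and offsets are made up for illustration.

    import mmap

    # Rather than explicitly shuttling blocks between "disk" and "memory",
    # map the whole file and let the kernel's paging machinery decide which
    # pages are resident in RAM and which stay on disk.
    with open("cached_object.bin", "rb") as f:          # hypothetical file
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as data:
            header = data[:16]           # touching these bytes faults the page in
            payload = data[1024:2048]    # only the pages we touch get fetched
            print(len(header), len(payload))

That is, in miniature, the difference between the Squid approach and the Varnish approach described above.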

Of course, 1975 programming isn’t dead. A microcontroller is much like a 1975 computer in terms of memory management needs, and there isn’t an OS to handle everything since the program itself deals directly with the hardware. So there’s nothing inherently wrong with 1975 programming. As in any other craft, tools and methods all have their place, and a good craftsman chooses the right ones for the task at hand.

Tech directions: IkaScope

They call it the IkaScope. “Robust and reliable analog front end. Measuring signals with IkaScope is quick, easy and intuitive. Sampling rate 200 MSPS; Bandwidth 25 MHz; Memory 4000 pts; Input range ± 40V; Coupling AC / DC; Refresh rate 200 FPS*.”

At this time, it’s a gleam in the designer’s eye so who knows what will eventually show up. The concept as described is feasible and illustrates a number of ideas about where instrumentation is going.

Wireless is more than just convenience. It provides electrical and functional isolation. This wireless uses a standard WiFi network and can provide an access point if need be. Now, if the display and control used standard HTML – i.e., if the device served up a standard web page rather than requiring some proprietary protocol – the only custom software would be that bundled with the custom hardware.
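
As a purely speculative sketch of that idea (the IkaScope’s actual interface isn’t defined yet), the Python snippet below shows how little it takes for an instrument to serve its display as an ordinary web page; the port number and waveform data are invented for illustration.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    FAKE_WAVEFORM = [0, 3, 7, 3, 0, -3, -7, -3]   # stand-in for real samples

    class ScopeHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/data":
                body = json.dumps(FAKE_WAVEFORM).encode()
                ctype = "application/json"
            else:                                  # any browser gets plain HTML
                body = b"<html><body><h1>Scope</h1><p>GET /data for samples</p></body></html>"
                ctype = "text/html"
            self.send_response(200)
            self.send_header("Content-Type", ctype)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("0.0.0.0", 8080), ScopeHandler).serve_forever()

A device that can do this needs no driver, no app, and no proprietary protocol on the client side.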

The portability story is also telling. This is an oscilloscope in a probe form factor. That level of integration not only packs capability into a small form factor, it can also reduce cost.

For speculation, consider how the IkaScope idea might be expanded to multiple channels. How might a collection of independent probes coordinate their measurements? Then there’s signal processing, data analysis, other measuring needs, and … just where can it go?

Doors open. Ideas flood in. Interesting times.

1700s tech: A piano story

It may not have been crafting nanocircuits on silicon, but innovation and craftsmanship have legs in history. Here’s a good story of that: What Does the World’s Oldest Surviving Piano Sound Like?: Watch Pianist Give a Performance on a 1720 Cristofori Piano.

Imagine your favorite works for the piano—the delicate and haunting, the thundering and powerful. … Now imagine all of it never existing. A giant hole opens up in world culture. … The piano seems inevitable when we look back into music history. Its immediate predecessors, the clavichord and harpsichord, so resemble the modern piano that they must have evolved in just such a way, we think. But it needn’t have been so.

Other makers tried different mechanisms, but “Cristofori was an artful inventor,” the Met remarks, “creating such a sophisticated action for his pianos that, at the instrument’s inception, he solved many of the technical problems that continued to puzzle other piano designers for the next seventy-five years of its evolution.” These designers made shortcuts, since Cristofori’s “action was highly complex and thus expensive.” But nothing matched his design, and those features were “gradually reinvented and reincorporated in later decades.”

Though Baroque composers at the time, including Johann Sebastian Bach, “were aware of it,” most, like Bach, harbored doubts. “It was only with the compositions of Haydn and Mozart” decades later “that the piano found a firm place in music.” A place so firm, it’s nearly impossible to imagine the last 250 years of music without it.

The problem with earlier keyboard instruments was the musician’s control over how loudly each note was played. That is the problem the invention of the piano solved.

Besides dynamic range, there is a lot of other invention and innovation in the piano from soundboards and concert amplification capabilities to the user interface. A history of music technology teaches many lessons.

Seek and ye shall find – that it’s a harder journey than you ever imagined.

Luboš Motl takes off on Nautilus’ disillusioned ex-physicist. “Bob Henderson wrote an autobiography for Nautil.Us” and that provided the ammunition for a rant on Life’s Journey that any college freshman with ambition should read carefully.

This is a sketch of the “path towards the deep laws of the Universe” that I already had in mind when I was 4 years old or so – and I think that other physicists who don’t relate to Henderson’s complaints would tell you something similar. Henderson is telling us that he was gradually discovering some of these things during his grad school years. One actually has to work hard at some moment, be materially modest, be confused much of the time, and try many paths that don’t lead to interesting outcomes, while the greatest discovery in a century arrives relatively rarely (approximately once a century, if you want to know).

Those are shocking facts!

You should have known it before you entered the graduate school.

In the college but also in the later years, I was talking to lots of people who begged for recommendations like that. What should I do not to get lost? My answer was never so direct but yes, my current answer would be: If you need this leadership repeatedly, just quit it. If you don’t know what you’re doing, why you’re doing it, and where you are going, and how you may roughly get there, then it’s a bad idea to start or continue the journey. People who are picking an occupation should feel some “internal drive” and they should have at least a vague idea what they’re doing, why, and how. Again, I don’t think that this common sense only holds in theoretical physics. Theoretical physics only differs by the deeper caves in which one may get lost – because deeper caves are being discovered or built by theoretical physicists, too.

There are more “shocking facts” from a ‘been there, done that’ source. When Motl gets on a rant, he can spin off some very useful information and ideas and fodder for introspection and analysis. Heave to and take a gander.

Tools and how they corrupt the mind. Programming languages, thinking, and doing

Keep in mind that it is the craftsman and not the tools that solve problems and create solutions, and that tools are the product of craftsmen, too. Tools not only enable and facilitate, they also communicate. They are one craftsman sharing ideas with another. How you can do something influences how you think about doing things, and that is the corruption of the mind in a process called learning.

FORTRAN, Formula Translation, was a first step up from machine programming by twiddling bits in registers. The programming language made it easier for mathematicians to tackle the process of solving an algebraic problem with a computer. Much like the bit twiddling was simply a concrete expression of Boolean algebra, FORTRAN was a standard mathematical process with some glue and grease to be able to ‘read’ input data and present results. Know algebra and you pretty much had computer programming handled. This has been true for most mainstream programming over the years, with a few oddballs thrown in now and then by revolutionaries (think Lisp or Smalltalk).

The fact was that the differences were mostly glue and grease. FORTRAN, BASIC, C, Pascal, and even assembly with macros all worked in pretty much the same way. Abstractions were limited and programs were built on expressions of an algebraic nature without much nuance. The first evolutionary step was in the bundling. This is where macros and libraries come in. Further down this line were the concepts of inheritance and object-oriented programming. That led to Java, C++, and others that implemented these ideas in program syntax but, still, these languages didn’t really need much mental corruption to understand and use.

Another evolutionary path was that of the mathematical purists. That can be seen in the war on the ‘go to’ statement and the development of control structures that hid the jumping around in loops and decisions. Another outcome of that is functional programming, where procedures operate in a carefully delineated manner to assure data cleanliness. Forth might be considered a purist approach to a stack-based algebra, too.
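
A minimal illustration of that data-cleanliness idea, using Python for concreteness: the first function quietly mutates shared state, while the second is pure and returns a new value, which is the discipline the functional style enforces.

    # Impure: the result depends on, and changes, state outside the function.
    totals = {"count": 0}

    def tally(x):
        totals["count"] += x        # hidden side effect on shared data
        return totals["count"]

    # Pure: same inputs always give the same output; nothing else is touched.
    def add(count, x):
        return count + x

    print(tally(5), tally(5))       # 5, then 10 -- call history matters
    print(add(0, 5), add(0, 5))     # 5 both times -- no hidden state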

Python is an example of trying to implement many concepts and ideas learned between 1950 and 1990 without straying too far from the basic algebraic language. It has done rather better than many other similar efforts and that may be due to supporting the old style while still accommodating new ideas in a way that appealed to enough of the right sort of people to build a community and a wealth of resources. Python supports object oriented structures, but you don’t have to use them. It supports functional programming but you don’t have to use that, either. It supports several abstract methods of bundling that simplify expression but boggle the mind and you don’t have to use them. 
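
As a small, made-up example of that “use it if you want to” flexibility, here is the same averaging chore written three ways in Python, procedurally, with a class, and with functional tools; none of the three is mandatory.

    values = [3.0, 4.5, 6.0]

    # Plain procedural style -- 1975 would recognize this.
    total = 0.0
    for v in values:
        total += v
    print(total / len(values))

    # Object-oriented style -- entirely optional.
    class Averager:
        def __init__(self, data):
            self.data = list(data)
        def mean(self):
            return sum(self.data) / len(self.data)

    print(Averager(values).mean())

    # Functional style -- also optional.
    from functools import reduce
    print(reduce(lambda a, b: a + b, values) / len(values))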

A lot of this is what is called “syntactic sugar.” That’s easy to learn. When functional capability starts to meld with this syntax, the paradigm changes and that is more difficult. Finding out what has been bundled for easy access to previous work is even more difficult but this can be done with a bit of research. Assembling the bits and pieces can be a real challenge which you can see if you look at the make file for a significant project or at all of the options supported by a modern compiler.
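
A concrete bit of that sugar, again using Python as the example: a list comprehension says nothing an explicit loop cannot say, it just says it more compactly.

    # The sugared form...
    squares = [n * n for n in range(10) if n % 2 == 0]

    # ...is shorthand for the loop it replaces.
    squares_longhand = []
    for n in range(10):
        if n % 2 == 0:
            squares_longhand.append(n * n)

    assert squares == squares_longhand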

So, as always, the choice of tool boils down to what you want to do. Do you want to learn or to teach or to illustrate or to document? What’s the target platform? What are the limitations and constraints imposed by the platform, your resources, and your needs? Do you want a public or a private solution?

Now it’s time to dig in and get something done. That’s the best way to figure out what to do next.


Federico Faggin on his early work

A YouTube presentation: Federico Faggin at UC Berkeley 2-19-2014 “Microelectronics & Microprocessors: The Early Years” (1:21)

This is one of those guys at the very top of the ‘been there, built it’ pyramid telling his story. He was the right engineer in the right place at the right time and the result was the modern microcomputer and solid state fabrication of electronic circuits.

In answering a question after the presentation, he gets into his ideas about consciousness, identity, and gestalt. Watch the video and get a glimpse of the person.

Linux reinstalled: samples from 1993 – 2003

How Linux got to be Linux: Test driving 1993-2003 distros — “Enjoy a trip down Linux memory lane as we take early distros for a spin.”
Posted 20 Dec 2016 by Seth Kenlon.

By the early 2000s, it’s clear that Linux has well and truly hit its stride. The desktop is more polished than ever, the applications available want for nothing, the installation is easier and more efficient than other operating systems. In fact, from the early 2000s onward, the relationship between the user and the system is firmly established and remains basically unchanged even today. There are some changes, and of course several updates and improvements and a staggering amount of innovation.

Seth uses QEMU as the virtual machine. He starts with Slackware 1.01 (1993) using an image he found rather than installing from floppies, though he did end up doing a floppy install for Debian 0.91 (1994). Remember those days? The graphics configuration headaches are noted as well, from monitor timings to X configuration settings and GUI presentations.
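
For anyone who wants to follow along, the test drives amount to launching old disk images under QEMU; the sketch below (using Python’s subprocess module, with a hypothetical image file name and a plausible period memory size) shows roughly what such a launch looks like.

    import subprocess

    # Boot a vintage disk image in QEMU with a 1993-appropriate 16 MB of RAM.
    subprocess.run([
        "qemu-system-i386",
        "-m", "16",                       # guest memory in megabytes
        "-hda", "slackware-1.01.img",     # hypothetical image file
    ])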

Read the essay and get a feel for what it was like for the pioneers.

Matzobrei

There have been a few essays on the meaning of Christmas with ideas to consider. Here’s one: Incomprehensible Word, Uncomprehending World: The Puzzle of Christmas by N.T. Wright.

A Jewish word spoken to an uncomprehending world; a child’s word spoken to uncomprehending adults; a word for a food of which others were unaware – 

Christmas is not about the living God coming to tell us everything’s all right. John’s Gospel isn’t about Jesus speaking the truth and everyone saying “Of course! Why didn’t we realize it before?” It is about God shining his clear, bright torch into the darkness of our world, our lives, our hearts, our imaginations – and the darkness not comprehending it. It’s about God, God as a little child, speaking words of truth, and nobody knowing what he’s talking about.

You may be aware of that puzzlement, that incomprehension, that sense of a word being spoken which seems like it ought to mean something but which remains opaque to you.

Don’t imagine that the world divides naturally into those who can understand what Jesus is saying and those who can’t. By ourselves, none of us can. Jesus was born into a world where everyone was deaf and blind to him. But some, in fear and trembling, have allowed his words to challenge, rescue, heal and transform them. That is what’s offered at Christmas, not a better-focused religion for those who already like that sort of thing, but a Word which is incomprehensible in our language but which, when we learn to hear, understand and believe it, will transform our whole selves with its judgment and mercy.

Listening. Understanding. Dissonance. Puzzlement. Incomprehension. Confusion. “challenge, rescue, heal and transform.” Christmas.

Linux growth in 2016 – developer’s numbers

Phoronix picks up on the latest Linux kernel release and takes a look at how much it has changed in 2016. Linux 4.10-rc1 Gained 488k Lines, Kernel Up 1.9+ Million Lines For 2016.

For those wondering how much weight the kernel gained in 2016, comparing the 4.10-rc1 kernel to 4.4-rc8 (released on 3 January 2016, basically the start of the year) the kernel tacked on nearly two million lines — of code, documentation, and other changes. For 2016 we’re at 33,286 files changed in the kernel tree yielding 4,168,283 insertions and 2,195,354 deletions.

As of this morning it puts the overall kernel tree at 57,202 files consisting of 22,833,860 lines. Keep in mind that’s not just 22.8M lines of pure code but also code comments, documentation, the in-tree tools, Kconfig, etc. But by any measure, the Linux kernel remains a huge and only growing project.

The Git tree history overall shows a total of 647,845 commits from 16,255 authors. For 2016 we are at 72,828 commits this year, down from last year’s 75,631 commits or the 75,613 commits in 2014.

22 million lines of stuff is a lot of computer program. Two-thirds of a million “commits,” or developer contributions, indicates that there are a lot of volunteers actively engaged in the project. This is the 25th year for the project and it has become a phenomenon with many paradigms.
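
A quick sanity check of the quoted figures, using nothing but the numbers above:

    insertions = 4_168_283
    deletions = 2_195_354
    print(insertions - deletions)   # 1,972,929 -- the "nearly two million" net line gain for 2016

    print(22_833_860 / 1e6)         # ~22.8 million lines in the tree
    print(647_845 / 1e6)            # ~0.65 million commits, roughly 2/3 of a million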

History of the 8008 (caution: pictures of a naked cpu!)

As you can see, the 1970s were a time of large changes in semiconductor chip technology. The 4004 and 8008 were created when the technological capability intersected with the right market.

Ken Shirriff shows how he got Die photos and analysis of the revolutionary 8008 microprocessor, 45 years old.

This chip, Intel’s first 8-bit microprocessor, is the ancestor of the x86 processor family that you may be using right now. I couldn’t find good die photos of the 8008, so I opened one up and took some detailed photographs. These new die photos are in this article, along with a discussion of the 8008’s internal design.

The 8008’s complicated story starts with the Datapoint 2200, a popular computer introduced in 1970 as a programmable terminal. (Some people consider the Datapoint 2200 to be the first personal computer.) Rather than using a microprocessor, the Datapoint 2200 contained a board-sized CPU built from individual TTL chips. (This was the standard way to build a CPU in the minicomputer era.) Datapoint and Intel decided that it would be possible to replace this board with a single MOS chip, and Intel started the 8008 project to build this chip. A bit later, Texas Instruments also agreed to build a single-chip processor for Datapoint. Both chips were designed to be compatible with the Datapoint 2200’s 8-bit instruction set and architecture.

The Datapoint 2200’s architecture was used in the TMC 1795, the Intel 8008, and the next version Datapoint 2200. Thus, four entirely different processors were built using the Datapoint 2200’s instruction set and architecture. The Intel 8080 processor was a much-improved version of the 8008. It significantly extended the 8008’s instruction set and reordered the machine code instructions for efficiency. The 8008 was used in groundbreaking early microcomputers such as the Altair and the Imsai. After working on the 4004 and 8080, designers Federico Faggin and Masatoshi Shima left Intel to build the Zilog Z-80 microprocessor, which improved on the 8080 and became very popular.

A key innovation that made the 8008 practical was the self-aligned gate—a transistor using a gate of polysilicon rather than metal. Although this technology was invented by Fairchild and Bell Labs, it was Intel that pushed the technology ahead.

While the 8008 wasn’t the first microprocessor or even the first 8-bit microprocessor, it was truly revolutionary, triggering the microprocessor revolution and leading to the x86 architecture that dominates personal computers today.

The legacy of the Datapoint still lives in the Intel architecture, with the ‘little-endian’ byte order and the parity flag being two examples cited. Those engineering decisions were made in an era long gone. They have been subsumed in newer technologies and newer circuit and component innovation. It took 6 years (to 1978) from the 8008 to break into the personal computer era with the Apple II, the TRS-80, and the many other ‘CPU on a chip plus a whole lot of glue circuits on a board’ personal computers. 6 years after that (~1984), the IBM PC tackled the business world and the 80286 showed up. 6 years after that (~1990), megabyte working memory sizes showed up with the 80386 to manage them. Another 6 years (~1996) and the I’net started pushing. By 2002, the modern PC era was well under way with networking, sound, multi-tasking operating systems, gigahertz clock speeds, gigabyte working memory sizes, 64-bit CPUs, and on and on.

Now, even a cheap cell phone has a GHz clock speed, a gigabyte of working memory, and 8 GB of non-volatile memory. And a high resolution color display and many sensors and capabilities not seen in any consumer computer even ten years ago … not to mention the network infrastructure that supports it. And the cell phone is a common-as-dirt Christmas present, even for kids.

Merry Christmas!