We are in the midst of an unprecedented technology revolution. Technology has all but woven itself into the very fabric of society and into the lives of the individuals who comprise it. “Wearable technology” is apparently the next big thing, and what will likely come after that is implanted tech. As our relationship with technology becomes more and more intimate, we become less and less conscious of the very technology we depend on — like the air we breathe and the blood that flows through our veins, it becomes invisible to our everyday consciousness and increasingly taken for granted. Does that trend, then, necessarily make our generation more ‘tech savvy’?
Gaurav Dhillon, founder of California-based data warehouse management company Informatica, is optimistic about the so-called ‘tech savvy’ generation…
The fear of computers has, in fact, left the building. New generations of employees, people who graduated this millennium, my kids — 13 and 6. The Millennials are not afraid of computers — they [may] not be programmers, but they’re tech-savvy. We think of them as citizen integrators.
What the term “citizen integrators” means is anyone’s guess. For me, it means that we are increasingly becoming the crowd upon which technology propagates itself. We’d like to think we are bringing technology into our lives. I think it is the other way around. Technology is bringing people into its network.
As tech advances, the majority of people do not get “savvier”. Rather, they become more oblivious to technology.
Back in the old days, because computers were ugly and unfriendly machines (not the “appliances” or “assistants” they are touted to be today), you actually needed to understand how a computer worked in order to use one. There was not much you could do on a computer in those days unless you had, at the very least, some rudimentary knowledge of a programming language and, therefore, some idea of how a machine “thinks”.
That is what I consider being “tech savvy”.
Today, user interfaces (UIs) are so sophisticated that any ordinary schmoe can use a computer. Breakthrough after breakthrough in UI design has humanised these machines and made their inner workings a virtual mystery to the average user. And the race is on to build ever more powerful “virtual assistants” into mobile devices that can listen and talk to their owners. No wonder we consider today’s youth “tech savvy”. Soon, even babies will be able to use computers. Then again, they already can. We are raising today’s babies and toddlers on a diet of digital iPad candy. The next generation will soon forget that the colourful, friendly icons and the soothing voices of “virtual assistants” like Siri are no more than cleverly programmed machines getting better and better at mimicking human behaviour.
If we believe that communicating with a machine designed to interface naturally with humans makes us “tech savvy”, then we may as well all consider ourselves brain surgeons.
We can’t all be brilliant coders, but we should at least gain a bit more perspective on what it means to be “tech savvy”. We need to look back to the original tech-savvy generation of Apple co-founder Steve Wozniak who, in the late 1960s and early- to mid-1970s, along with a generation of computer hobbyists that included Bill Gates, went on to develop the rudimentary interfaces that would bring computers closer to the masses — their own implementations of the BASIC language. Back then, Woz considered it the “key” to what, at the time, was the masses’ only exposure to computers — video games…
The key to games was BASIC. Bill Gates was unknown except in the electronics hobby world. Everyone in our club knew that he’d written BASIC for the Intel microprocessor. I sniffed the wind and knew that the key to making my computer good (popular) was to include a high-level language and that it had to be BASIC. Engineers programming in FORTRAN were not going to be what would start a home computer revolution.
Through Woz, we get a glimpse of the sort of thinking that went into developing the technology that would go on to deliver hours of entertainment to the subsequent generation of self-described “savvy” computer users…
With the Apple ][ I had the video and computer memory one and the same so that the microprocessor, changing maybe a million (exaggerated) numbers a second, would change a million screen bytes a second. Atari arcade games were hardware then but now the games could be implemented as software, using 6502 machine language programming.

BASIC is an interpreted language. BASIC goes over the individual letters of each statement as it executes, determining what to do. It is maybe 100 or 1000 times slower than machine language as a result. But one day I was curious as to whether you could program moving objects just in BASIC and see them move like realistic animation. I had designed Breakout for Atari in hardware. I wondered if I could program this simple animated arcade game in BASIC? I knew I could program it in machine language.

Since it was my own BASIC I went to the syntax chart and added commands to plot color and to draw horizontal and vertical lines. I then searched chip manuals and chose a chip with 4 timers (555 style timers) on one chip. I used that with some software to read paddle positions off potentiometers, dials that changed resistance according to where you turned the dial.

Once I had these mechanisms installed (burning new EPROMS for the BASIC additions) I sat down and wrote some simple FOR loops to plot bricks in different colors. I must have tried 30 color combinations in just minutes. Then I added paddles and score and a ball. I could adjust program parameters to change the ball speeds and angles. By the way, I think this is when I added a speaker with 1-bit audio just because you needed sounds when a ball hit a brick, etc.
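For readers who have never seen what that looks like, here is a minimal sketch, in Apple-style BASIC, of the kind of FOR loop Woz describes for plotting rows of bricks and reading a paddle. GR, COLOR=, HLIN and PDL are the Apple ][ low-resolution graphics and paddle-reading commands; the particular line numbers, colours and layout below are my own illustrative guesses, not Woz’s actual Breakout code.

10 GR : REM SWITCH TO THE 40 X 40 LOW-RES COLOUR GRAPHICS MODE
20 REM PLOT EIGHT ROWS OF BRICKS, EACH ROW IN A DIFFERENT COLOUR
30 FOR Y = 0 TO 7
40 COLOR= Y + 1
50 HLIN 0,39 AT Y : REM A FULL ROW OF "BRICKS"
60 NEXT Y
70 REM READ PADDLE 0 (0 TO 255) AND SCALE IT TO THE 40-COLUMN SCREEN
80 P = PDL(0) * 34 / 255
90 COLOR= 15 : HLIN P,P + 5 AT 39 : REM DRAW THE PLAYER PADDLE ON THE BOTTOM ROW

Even a toy fragment like this shows how close to the machine “using a computer” was back then: nothing appears on the screen unless you tell the hardware, step by step, what to draw and when.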
“Tech savvy” we now say we are? The Philippines, for example, prides itself on being host to one of the planet’s biggest throngs of “social media” users. But whether we are savvy enough to apply all that technology to building collective intelligence, well, that remains to be seen.