
“Spaghetti Code” as Computer Science Lingo

Main Piece

Informant: Spaghetti Code is exactly what it sounds like. Usually, I’ll use Spaghetti Code when I need to get something working and I don’t give a shit what it looks like. It usually works, but it breaks pretty easily. It is completely unreadable, but it gets the job done.

Interviewer: In what context would you use this? 

Informant: Say I have a lab due tonight, and I have an hour to do it, and I just need something to pass the cases, so I just code something really half-assed. If someone asked me if I did the assignment, I would tell them, “yeah, I did it but it is all spaghetti.”

Interviewer: Where did you learn it? 

Informant: Sophomore year through word of mouth, friends just kinda started using it around me so I picked it up. 


The informant is a good friend and housemate of mine and is a junior at USC studying Computer Science and Computer Engineering. He is originally from Manhattan Beach, CA, and has been coding ever since high school. He has had several internships at technology companies such as Microsoft and is very involved with different coding clubs on campus.


When I asked my informant how his assignment went, he described it using this term. Since it was a term I had never heard before, I brought it up during our interview and asked him to describe it and provide some more context about when he would use it.


I think this example of folk speech is a very colloquial and humorous way for computer science students to describe their work and relate to one another. It is a quick indicator of the quality of their code, and it provides imagery that is usually not present within the lingo and world of computer science. Especially for a subject and major that can harbor a lot of stress, the term may also have arisen because students did not have the time or energy to put quality work into their coding assignments but still needed a way to get them done.
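For readers outside computer science, a purely hypothetical sketch (invented for this entry, not the informant’s actual code) may help illustrate what “spaghetti” means in practice, contrasted with a cleaner version of the same logic:

```python
# "Spaghetti" style: tangled, nested special-casing that passes the
# test cases but is hard to read or change. (Hypothetical example.)
def grade(s):
    if s >= 90: return "A"
    else:
        if s >= 80: return "B"
        else:
            if s >= 70: return "C"
            else: return "F"

# A cleaner equivalent that a reader could actually maintain:
def grade_clean(score):
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C")]:
        if score >= cutoff:
            return letter
    return "F"
```

Both functions return the same answers; the difference the term points at is entirely about readability and maintainability, not correctness.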

The Y2K Virus

Y2K stands for Year 2000. In the 90s, computers were still a relatively new addition to the daily lives of Americans, and few understood how they worked. As the new millennium approached, a rumor spread that because computers stored years as only two digits, when the digits rolled over to 00 at the turn of the millennium, all computers would shut down due to their inability to compute the date, resulting in an event imagined as something like the inverse of what is now called the Singularity.

GT: one of my neighbors bought enough canned food and water to survive an apocalypse, like he literally blew his life savings on survival supplies to prepare for Y2K virus. He thought it’d be anarchy but the virus never happened.

In reality, the Y2K problem did exist: because computers stored only the last two digits of the year, 2000 was indistinguishable from 1900, and the rollover from 99 to 00 caused logical errors since the century digits were never stored. However, this was purely a programming issue, and most companies were able to upgrade their systems to avert a possible crisis before it occurred. On January 1, 2000, the glitch’s main impact was some malfunctions in data storage after certain programs started up, while most programs were unaffected. A few machines integral to life in developed countries malfunctioned and generated false results due to the bug, but the issue was contained quickly enough to avoid pandemonium.
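The underlying arithmetic of the bug can be sketched in a few lines. This is a minimal illustration of the two-digit-year problem described above, not any particular system’s actual code:

```python
# Hypothetical sketch of the Y2K bug: when only the last two digits
# of the year are stored, 2000 ("00") is indistinguishable from 1900,
# and date arithmetic across the rollover goes wrong.

def age_two_digit(birth_yy: int, current_yy: int) -> int:
    """Age computed the pre-Y2K way, from two-digit years."""
    return current_yy - birth_yy

# Someone born in 1960, checked in 1999 ("99"): works fine.
print(age_two_digit(60, 99))  # 39

# The same person checked in 2000 ("00"): the age goes negative.
print(age_two_digit(60, 0))   # -60
```

Fixes amounted to storing or inferring the full four-digit year, which is why the remediation was tedious but conceptually simple.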

The rumor that the problem would significantly disrupt daily life was unfounded, but it did cause panic leading up to the event. Because people were only just beginning to rely heavily on machines in daily life, a fear of machinery quickly blew a manageable problem out of proportion, seeming to confirm, for those distrustful of the digital age, that the machines would bring about the end of civilization.