Saturday, May 7, 2011

Preview: IPRO 315 Devblog

Now that finals are over, I'll be making a series of developer blog posts (devblogs) about the projects I worked on this semester and the things I learned working on them.

This post serves as a preview for a series of blog posts detailing the development of the fast food ordering application I developed for IPRO 315.

The primary goal of IPRO 315 was to increase the accuracy of orders placed at drive-through fast food locations. The project was essentially split into two parts:

1. Converting an ordering kiosk audio system developed by previous project teams from a wired connection between the kiosk and the restaurant to a wireless one

2. An ordering application for mobile devices, which would eliminate order errors caused by audio fidelity issues entirely by letting customers see a textual summary of their order before submitting it.

Initially we anticipated parallel development of an iOS application and an Android application, both targeting a tablet form factor for interface consistency. Unfortunately, as the sole software developer on the project, I was unable to commit enough time to make the Android application viable. Instead, I focused solely on developing the iOS application for iPad.

Later posts will detail various design considerations and things I learned about iOS development as a result of this project. For now, I'll post some screencaps of the application running in the Xcode simulator.

Monday, May 2, 2011

Projects In, Finals Incoming

Last undergraduate finals/projects this week. Control Systems tomorrow. Parallel and Distributed Processing Friday.

Sunday, April 24, 2011

Short Story: The Luddites Were Right

I had to write a short story for a class on the topic of "Computers and Terrorism". This is the result.

The Luddites Were Right


I was considered an expert on the subject of 'Computers and Terrorism', but that all changed recently. The irony was not lost on me as protesters' cries of 'Luddite!' and 'Amish scum!' rang through the crisp evening air outside my window. It was the eve of The Continuity, that fabled day in which man became immortal, free from the needs and weaknesses of flesh. Raymond Kurzweil had been championing the idea for years. I suppose it turns out he was right, though wherever he is now, I'm quite sure he wasn't expecting this.

In the years leading up to that fateful day, I had frequently, and publicly, clashed with the Transhumanist movement. I had argued that we didn't yet understand enough about the technologies underlying this momentous transition. The supreme arrogance of the inventor, that he, and he alone, truly understands the impact and implications of his creations, his children, had permeated the cult of Transhumanism. Gone were the days when people would step back and examine the implications of their innovations, unsure whether they truly brought something positive or fearful of potentially catastrophic outcomes. “Rubbish!” they would say. “We've systematically considered the possibilities. An overwhelming majority of outcomes point to nothing short of Utopia. Your fears are irrational and rooted in the primitive 'fight or flight' mentality that kept your ancestors alive in a hostile world. Things are different now.”

They had lost touch with their humanity. With their full-body prostheses, they were nothing short of demigods with the outsized egos of the Olympians themselves. Technology had enabled them to become stronger, faster and smarter than any creature that had ever walked the earth. The only thing that connected them to their previous fleshy existence was their brains, and even those had been so heavily augmented as to be unrecognizable. The Continuity would be the final step; the last barriers to true godhood had been overcome at long last.

Leading the charge was the prophetically named Icarus Foundation. As the most profitable corporation in the world, with cash reserves an order of magnitude greater than those of the recently established global government, the Icarus Foundation had enough financial and political resources to shape the world as they pleased. They could have put those resources toward ending global hunger, curing disease, or even simply enriching themselves. Instead, they directed all their energy toward furthering the Transhumanist agenda. With its board of directors populated entirely by prominent Transhumanists, it was easy to see why.

Their plan was simple. Offer everyone on Earth a chance to leave their bodies behind and join them in a world of ones and zeros. Free of charge, naturally. After years of research, they had developed a cost-effective way to translate human consciousness into digital form. The subject would have a small device placed at the base of their skull, designed to read and record their brain waves as well as manipulate their senses, creating a virtual reality that existed only in their minds. In this virtual sandbox, they would be subjected to a wide variety of stimuli in order to create a kind of map of their brain. The map would be used to generate the data structures and algorithms necessary to transfer a consciousness from a living brain to a machine.

Their initiative was wildly successful. With the world's population standing at roughly twenty billion people, the number willing to accept an offer of Nirvana and escape their lives of poverty was staggering. In the end, over ninety-eight percent of the world's population had their minds digitized and their bodies disposed of. After its announcement, public sentiment was such that all opposition was quickly silenced under a hail of rocks.

However, I had misgivings about Icarus from the start. Its origins can be traced back to a Transhumanist terrorist organization known as Ascension. Without warning, Ascension staged a series of bombings around the world that killed hundreds outright, with hundreds more dying of their wounds. Ascension's signature style was to coordinate their bombings with large-scale attacks on the telecommunications infrastructure of emergency service providers to hamper their response. In the aftermath of each attack, a letter written in ASCII art would be sent to the authorities. The message was simple: Flesh is fear. Fear is weakness.

Ascension disappeared from the scene as abruptly as it had appeared. This coincided with the establishment of the philanthropic Icarus Foundation. Funded by an anonymous group of wealthy donors, the Icarus Foundation quickly became one of the world's leading aid organizations. Following that success, it began acquiring large numbers of technology companies, eventually transforming itself into an economic juggernaut. Its motto: Fidens Sum Vere Licens. Without fear I am free.

Although not publicly known at the time, one of the first companies it acquired was run by former members of Ascension. Having evaded the global hunt for the perpetrators of the Ascension bombings, they had simply faded away. After its acquisition by the Icarus Foundation, the company was integrated into the division responsible for developing the artificial intelligence that would greet the newly “ascended”. The A.I. would gently integrate people into a world where the five traditional senses had no meaning, an understandably jarring transition.

Everything went well for a time. Then Murphy's Law reared its ugly head. The most extreme members of Ascension came to believe that the only true freedom from weakness lay in chaos. Taking the concept of entropy to the extreme, they introduced a parasitic bug into the A.I. that greeted every newly digitized human being. The bug would gradually corrupt all the data that comprised a person's digital identity. Interactions between those on the outside and the “ascended” gradually became erratic, degenerating into bursts of random data until they finally ceased altogether. Then one day, all the lights simply disappeared.

In the end, those who gave up their bodies to join The Continuity got what they wanted: freedom from weakness. It had only cost them their lives. As for me, well, those cries of “Amish scum!” I heard outside my window all those years ago turned out to be prophetic as well.

Sunday, April 17, 2011

Learned Today: Small Caps

So I'm sure most of you have seen a scientific paper where the section headings are bold and in all capital letters, with the first letter of each word larger than the rest. Today, I learned the technical term for this style: small caps, or small capitals. I also learned how to apply them in OpenOffice: highlight text -> Format -> Character -> Font Effects -> Effects -> Small capitals.

Friday, April 15, 2011

Power User or e-Slob?

Does this make me a power user or just an e-Slob? Note: there are no games running in the background.

Thursday, April 14, 2011

Technical Interview Fail

So I had my technical interview for a Fortune 500 company today. From the title I think it's obvious I fared quite poorly. The opening questions were typical: "Tell me about the project" and "Compare your two most familiar programming languages. What do you like/dislike about each?" While I was underprepared for these, it's trivial to contrive rehearsed answers for questions of this type. My most significant failure relates to the problem presented next.

It is a simple problem, one that any first-year Computer Science student should be able to solve with ease, let alone a fourth-year graduating senior.

Problem: Develop an algorithm for a function that takes an array of integers and an integer containing the size of that array. Remove all the duplicate values from the array. Return the new size of the array, that is, the number of unique values in the array.

Constraints:
  1. You may not use any additional data structures. Local variables are allowed.
  2. Valid data must be in a form that is accessible using only the value returned. That is, using the number of unique values in the array, it must be possible to access only the valid data within the array.
While the second constraint wasn't explicitly given, I felt it was implied since any integer is considered potentially valid data. This precludes filling slots with a "null" value.
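For concreteness, here's what the function being asked for might look like in C. The name and exact signature are my own choices for this post, not anything the interviewer specified:

    /* Remove duplicates from arr in place and return the number of unique
       values. Afterward, arr[0..result-1] must hold exactly those values. */
    int remove_duplicates(int arr[], int size);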

Given this problem, I am ashamed to report that I panicked. I resorted to the embarrassingly obvious naive solution. Moreover, even for a naive solution, it was a particularly poor one.

That algorithm is as follows (sketched in C after the list):
  1. Declare a variable to count the number of duplicate entries
  2. Starting with the first element in the array, compare it to each other element.
  3. If they are equal, then the second element is considered the duplicate.
  4. If a duplicate element is found, it is swapped with an element from the back of the array.
  5. Continue until the element being examined is itself a swapped duplicate; that is, until its index equals the size of the array minus the number of swapped items minus one.
  6. Return the size of the array minus the number of duplicate items.
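In C, that swap-based algorithm comes out roughly like this. This is a reconstruction for this post, not the code I actually produced in the interview:

    /* Naive O(n^2) dedupe: duplicates are swapped to the back of the array.
       Returns the number of unique values; arr[0..returned-1] holds them. */
    int remove_duplicates_naive(int arr[], int size)
    {
        int duplicates = 0;
        for (int i = 0; i < size - duplicates; i++) {
            for (int j = i + 1; j < size - duplicates; ) {
                if (arr[j] == arr[i]) {
                    /* Swap the duplicate with the last not-yet-examined element. */
                    int tmp = arr[j];
                    arr[j] = arr[size - duplicates - 1];
                    arr[size - duplicates - 1] = tmp;
                    duplicates++;
                    /* Don't advance j: it now holds a new, unexamined value. */
                } else {
                    j++;
                }
            }
        }
        return size - duplicates;
    }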
Given that algorithm, I proceeded to implement it in code, roughly as sketched above. Upon completion I read that code to my interviewer. He pointed out several errors I had made in coding. Most importantly, he told me that swapping values was pointless: since duplicated data should be "removed", preserving it through swapping was a useless waste of time and resources. I acknowledged this point and modified my code to simply copy over the duplicate data rather than swap it.

And then came my Great Fail. My algorithm is O(n^2) in the worst case, with n being the number of elements in the array. The natural follow-up question is, of course, "What can be done to improve this algorithm?" I was stunned. In the back of my mind I knew this question was coming. I would have asked it myself had our positions been reversed. I had no answer. Did I have an idea? Certainly. I thought of using a binary tree to reduce the number of comparisons necessary. A friend of mine later suggested using a hash table upon hearing this problem. Both of these ideas are foolish. Indeed, they clearly violate the first constraint. So I said nothing. Should I have said something? Talked through my thought process? Perhaps. Although in retrospect the only thing I would have done was suggest a binary tree to reduce the number of comparisons. Perhaps the interviewer would have let me talk and at the end said something to the effect of, "You're not allowed to use data structures." Perhaps he would have cut me off at the mere mention of the words "binary tree." In his place I certainly would have. Perhaps in talking I would have been inspired and come up with my current solution or something like it. We'll never know.

Instead, silence permeated the room until finally the interviewer had had enough. "Do you have any questions for me?" he asked. Still stunned by my inability to respond intelligently to the previous question, I gave a meek "No, not at the moment." "Thank you for your time," he said. And that was the end.

That was more than five hours ago. Now I've come up with another solution to this easy problem. Perhaps it doesn't work or my analysis is flawed. I don't care. I need to write this down.

To understand my solution it's necessary to know a little bit about heapsort. Heapsort is an "in-place" sorting algorithm: it requires only a constant amount of additional memory to execute. This is key due to Constraint 1. Heapsort has a worst-case run time of O(n log n). I won't go into the details of big O notation here. It is sufficient to say that O(n log n) is significantly better than my naive O(n^2) algorithm. The idea, then, is to heapsort the array in place first; once the array is sorted, all duplicates sit next to each other, and a single linear pass can remove them.
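For reference, here is a textbook sift-down heapsort in C. I certainly couldn't have produced this from memory in the interview, but it shows that nothing beyond a few local variables is needed, which is what makes it compatible with Constraint 1:

    /* Sift the element at `root` down into place in a max-heap of `size` elements. */
    static void sift_down(int arr[], int root, int size)
    {
        while (2 * root + 1 < size) {
            int child = 2 * root + 1;                /* left child */
            if (child + 1 < size && arr[child + 1] > arr[child])
                child++;                             /* right child is larger */
            if (arr[root] >= arr[child])
                break;                               /* heap property holds */
            int tmp = arr[root];
            arr[root] = arr[child];
            arr[child] = tmp;
            root = child;
        }
    }

    /* In-place heapsort: O(n log n) time, O(1) extra space. */
    static void heapsort_ints(int arr[], int size)
    {
        for (int i = size / 2 - 1; i >= 0; i--)      /* build the max-heap */
            sift_down(arr, i, size);
        for (int i = size - 1; i > 0; i--) {         /* extract max, shrink heap */
            int tmp = arr[0];
            arr[0] = arr[i];
            arr[i] = tmp;
            sift_down(arr, 0, i);
        }
    }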

"Solution":
  1. Declare variables to count the number of duplicate items and the array position (index) of the last (furthest to the left) duplicate value. Initialize them to 0 and -1, respectively.
  2. Compare each element with its neighbor. If the first two elements are the same, the right one is considered the duplicate. The last duplicate is now at position 1.
  3. After this, continue comparing elements with their right neighbor.
  4. If a unique element is found, copy its data to the position of the last duplicate. If there is no last duplicate, do nothing. A unique element is one whose right neighbor does not have the same value.
  5. If a duplicate element is found, the current element is considered the duplicate and its index is stored as the last duplicate. Increment the count of the number of duplicate elements found.
  6. Continue this until all elements have been compared.
  7. Return the size of the array minus the number of duplicate items. The array from index 0 to index array size minus number of duplicates minus one contains sorted, unique values.
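In code, the "last duplicate" index is really just a write pointer marking the next slot where a unique value belongs. Combined with the heapsort above, the whole solution is only a few lines of C. Again, this is a sketch; I haven't tested it any more rigorously than the steps above:

    /* Sort, then compact unique values to the front in one linear pass.
       Returns the number of unique values; arr[0..returned-1] holds them,
       in sorted order. */
    int remove_duplicates_sorted(int arr[], int size)
    {
        if (size <= 0)
            return 0;
        heapsort_ints(arr, size);          /* duplicates are now adjacent */
        int write = 1;                     /* next slot for a unique value */
        for (int read = 1; read < size; read++) {
            if (arr[read] != arr[read - 1])
                arr[write++] = arr[read];  /* unique: copy it forward */
            /* otherwise it duplicates its left neighbor; skip it */
        }
        return write;                      /* array size minus duplicates */
    }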
This solution is simple. Absurdly so. Is it correct? I don't have the energy to confirm its veracity. My preliminary analysis indicates that its run time is O(n log n). It matters little now. It took me five hours to come up with this solution. Utterly embarrassing. Even if it turns out this solution is deeply flawed or no better than my original, it is far better than having said nothing at all.