You work for me, Computer.

By Brandon Bloom

Computers Are Everywhere, You No Longer Have an Excuse

I work in an environment full of brilliant, technically minded people. Try as I might, sometimes I simply forget how little most people know about computers.

In IKEA, I saw an odd wire metal rack that mounted to the side of a desk and held a computer tower suspended off the ground. Why anyone would attach such an ugly thing to a desk is beyond me. What is wrong with the computer being four inches lower, on the ground?

Moments later, a young employee pointed to the tower rack and referred to it as a “CPU holder”. Never mind that she was actually trying to sell this silly thing to someone; she had just called a car engine by the name of a fuel injector!

Microsoft Manages Interns Better Than Google

In the interest of full disclosure: I was not offered a full-time position at Google.

I was encouraged to apply again after earning a master’s degree or after a few more years of experience. Please do not misconstrue this post as sour grapes; I mainly applied for a full-time conversion at Google to improve my Microsoft offer.

The title of this post is hyperbole. Both companies are full of brilliant engineers, managers, and HR staff. Both do an excellent job of managing interns; they really do care. However, my experiences at Microsoft made it impossible for me to choose Google. Both sets of experiences were overwhelmingly positive, but several individuals at Microsoft went that extra mile and won me over in a big way.

Early during my first internship at Microsoft, it was apparent that my skills were being misappropriated. I wasn’t having a whole lot of fun and my direct manager quickly identified the problem during one of our weekly 1-on-1 meetings. He radically adjusted the course of my internship to fit my skills and personality. Soon, things were going much better. I was more productive, having more fun, and learning a lot more.

Great experiences didn’t end with just my direct manager. Each intern also gets a mentor for additional job and team related support as well as a “coach” from a different part of the company to jump start your personal network. My mentor helped me understand the politics and personalities of our team and team members. My coach convinced me to tell my recruiter about the changes going on with my internship. My recruiter encouraged me to conduct informal interviews and have casual lunches with various people around the company. These “informationals” led to an awesome position for my second internship that was a perfect fit for me.

Internships at Microsoft are as much about growing people as they are about getting engineering done. There are a lot of people around to support you, so you receive a lot of support. If you are lucky — like I was — all of these people are exceptional. Even if you are unlucky, at least one of your many network seeds may be helpful. Additionally, recruiters are more personally involved. Our recruiter took us out for dinner and drinks in small groups and got to know us quite well.

At Google, I once again felt as if my skills were being misappropriated, but enjoyed far less success in correcting it. For the majority of the internship, I only interacted with my direct manager. He lacked experience mentoring interns and was a little surprised when I raised concerns over the quality and pace of my own work. With no one else to turn to, I contacted my recruiter, who was alarmed by my frustration. Although my manager and recruiter had the best of intentions, they chose to reassure me that I was doing good work rather than reposition me so that I could do outstanding work.

I am more vocal and assertive than many engineers, and especially interns, at both Microsoft and Google. Speaking up at Microsoft did wonders for my career, but I didn’t have to say a word for people to ask “what’s the matter?” At Google, I spoke up and I felt as though I was the first to ever do so — people were surprised and unsure how to react.

It should be stressed that these were personal experiences. Some other fresh college graduate may have had the exact opposite experience.

Imported Comments

michaelkimsal.com

Very interesting. MS is likely just a much more mature company with respect to human issues. Given that they’ve been around a lot longer, that’s not surprising. Also, given the competitiveness MS has, they may be more attuned to issues of people being dissatisfied than they were, say, 10 years ago or so.

By contrast, Google is a relatively new beast, having really only had a few years’ experience of having thousands of employees to manage. And after all, “it’s Google!”. For someone to suggest anything is “wrong” internally, culturally, probably isn’t something they’ve had to deal with much yet. When thousands of people leave over a short period of time, they’ll have to learn to deal with these issues at a cultural level – everyone will need to be aware of it, not just managers.

Good luck at MS!

Alexandru

i’ve been to google two summers already & my experience was close to perfect both times. i was given the freedom to choose which project to work on and i really felt that i was part of the team and not just an intern. more than that, i don’t crave too much attention.

i don’t care about social networking, but at google we had social parties each friday where you drink wine & beer for free, meet (really!) new people, and discuss company internals & stuff.

can’t say anything about microsoft, cause i’ve not been there, but good luck joining them.

Brandon Bloom

(Cross-comment from Hacker News)

“That’s why I ended the post by saying this was a personal experience. I just wanted to make the point that it certainly appeared like Microsoft was better prepared to cope with this situation and it also seemed that it was due to a more mature internship program. I loved working for both companies and there will definitely be pro-Google posts in the future :–)”

@michaelkimsal:

Google deals with people suggesting things are “wrong” internally CONSTANTLY. In fact, there are numerous occasions where it is highly encouraged and institutionalized. Unfortunately, I had no platform for my problem and only a very small number of people were able to see it. The response was adequate, but far from perfect.

Regarding social TGIF:

They are great! I did meet a lot of interesting people, but the context is primarily social. Discussions are less you-and-me-centric and more Google-and-tech-world-centric.

Rift

Rift is a top-down shooter with a time-stopping twist.

It was written in XNA by three Drexel students in three days:

  • Brandon Bloom – Lead Programmer
  • Lee Baker – Programmer and Sound
  • Charbel El-Beyrouty – Artist

Watch the video:

(low quality, no sound, obvious recording artifacts — I might have to buy Fraps)

NOTE: Game and source download has gone missing. Sorry!

A Googol at Google

Ok, well maybe not quite a Googol (1×10^100), but certainly 2^63 − 1 (9,223,372,036,854,775,807).

This quote was taken from a talk by Steve Yegge:

We overflowed a long at Google once. Nobody thought that was possible, but it actually happened.

Working at Google has irreversibly damaged my ability to reason about numbers. One moment you are talking about petabytes of data and the next moment you are talking about nanoseconds. It is truly mind numbing, but it has become hard not to laugh when someone talks about a huuuuge dataset of a few gigabytes or a fast process at a few seconds.
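Yegge’s anecdote is easy to simulate. Python’s own integers are arbitrary-precision and never overflow, so this sketch (the helper name is my own invention) emulates a signed 64-bit long with a bit mask:

```python
# Emulate signed 64-bit "long" arithmetic in Python, whose native
# ints never overflow; the mask reproduces the fixed-width wraparound.
MASK64 = (1 << 64) - 1
MAX_LONG = (1 << 63) - 1  # 9,223,372,036,854,775,807

def add_int64(a, b):
    """Add two numbers with 64-bit two's-complement wraparound."""
    s = (a + b) & MASK64
    # Reinterpret the top bit as a sign bit.
    return s - (1 << 64) if s >= (1 << 63) else s

print(add_int64(MAX_LONG, 1))  # -9223372036854775808: the overflow
```

In a language with fixed-width longs (C, Java), `MAX_LONG + 1` wraps like this silently, which is exactly why nobody at Google thought it could happen.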

It Is Easier to Pirate Windows Than to Install It Legally

My desktop machine is totally hosed. I don’t know why or what is wrong, but it is really bad. So I want to re-install Vista, but I can’t find the DVD or product key.

I currently own several Windows Vista licenses:

  • Gift for being a Vista beta tester
  • Drexel CS Department
  • From interning at Microsoft for a combined 9 months (and full-time this July)
  • Microsoft Dreamspark
  • Probably more

I expected to use my beta tester Ultimate license by logging into http://connect.microsoft.com/, downloading the DVD, and getting the product key from Connect as well. Unfortunately, I can’t find the product key or download anywhere on the new Connect. Additionally, I’m blocked from all MSDN subscriber downloads because, well, I’m not a subscriber. Strangely enough, I vividly remember being able to download Vista from MSDN at one point…

I’m looking high and low for a spot to download Windows Vista legally. It is laughably easy to download it illegally, and the download speed will be better than Microsoft’s servers anyway.

The next version of Windows should be freely available for download and the installer should ask for my Windows Live ID, not some cryptic 5x5 product key.

When Order Doesn’t Matter

Programming languages, imperative ones in particular, place a lot of emphasis on sequence. Take this simple pseudocode for example:

components = [4.5, 3.2, 4.0, 3.8]
min(round_down(components))

The result of this code is 3, but the same result can clearly be achieved with less computation by performing:

round_down(min(components))

Correct me if I am wrong, but present-day compilers operate on source code that lacks the semantics needed to perform such an optimization.
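The equivalence is easy to check by hand; what the compiler lacks is the knowledge that rounding down is monotonic and therefore commutes with taking the minimum. A minimal Python sketch of the two orderings:

```python
import math

components = [4.5, 3.2, 4.0, 3.8]

# As written: round down every element (n operations), then scan for the min.
expensive = min(math.floor(x) for x in components)

# The cheaper equivalent: scan for the min once, then round down once.
cheap = math.floor(min(components))

# floor is monotonic, so the two orderings always agree.
print(expensive, cheap)  # 3 3
```

Nothing in the source tells an optimizer that `floor` preserves ordering, so it must execute the operations in the sequence given.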

Pythagoras Was a Smart Guy

In a recent presentation by Epic’s Tim Sweeney, it was stated:

Factoid: C# exposes more than 10 integer-like data types, none of which are those defined by (Pythagoras, 500BC).

This got me thinking; why do I, a programmer, have to tell the computer exactly how much memory to allocate for a number? Why can’t I define a variable as a Whole Number or as a Real Number?

For one thing, the computer cannot predict how large a number may grow or what level of precision is required. In theory, any integer can have an infinite size, but in practice 32- or 64-bit integers have proven acceptable in most instances. This is because integers used in software nearly always represent a very real, very finite count.

Currently, determining how big a number can grow is left to the programmer. Why not instead have the programmer define the semantics of a number and let the computer choose a representation size? Should the programmer omit any such constraints, default to an “infinite”-precision variable and leave semantic definitions as an optimization exercise.
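This is roughly how Python already treats integers: `int` is arbitrary-precision by default, and fixed-width representations are an opt-in optimization (via `ctypes`, NumPy, and the like):

```python
# Python's built-in int never overflows; the runtime grows the
# representation as needed, so the programmer never declares a width.
factorial = 1
for i in range(1, 31):
    factorial *= i

print(factorial)   # 30! = 265252859812191058636308480000000
print(2 ** 100)    # 1267650600228229401496703205376
```

The programmer states only the semantics ("a whole number"); the runtime picks and grows the representation.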

Imported Comments

hostilefork

Well…Pythagoras may have been smart, but rumor has it that the Pythagoreans killed the guy who proved the existence of irrational numbers! http://en.wikipedia.org/wiki/Hippasus

In any case, I agree with your idea that bignums should be the default in almost any programming system: http://en.wikipedia.org/wiki/Arbitrary-precision_arithmetic

This was driven home recently when a friend of mine was confused by the results of a floating point calculation. I had to explain IEEE 754, and was thinking that it really does seem like something the average scripter or Excel user should not have to learn.

Would be even more fun if we could store all calculations symbolically, and have some kind of mathematical engine behind the scenes. The symbol engine would reduce expressions whenever it could (e.g. sin(pi) => 0), and would create string caches of the approximate values to whatever precision you wanted for display.

That might take a while. :)

Brandon Bloom

Threw him overboard? That’s an irrational response! (har har har, oh puns)

I’d love to see symbolic computations as the default for a mainstream programming language. Software is about abstraction, so it seems reasonable that the default behavior is as abstract and general as possible. If performance suffers, pervasive type inference should be capable of switching to IEEE standard doubles with a single type annotation.

File Systems for People

Using Blogger, I am able to create and edit blog posts from a set of simple lists without any concern for how the data is stored. Likewise, there is no “save” button to be clicked in OneNote. Blogger and OneNote handle my data intelligently for me. Yet if I create a document in Microsoft Word, I am forced to provide a file name and save location just to store my document.

As a user, I can use a file system to organize my documents, but generally, I just don’t care. As long as all of my files are grouped by project (read: one level deep), I can find practically anything I need. N-tier grouping is overkill for nearly all typical user scenarios.

File systems define data relationships by hierarchical locality. If two files are in the same folder, they are probably related, but there is no guarantee of that. Data has a lot of inherent relationships that go unexploited by present-day software. Source documents such as code and Photoshop files are in no way linked to their output documents such as executables and web graphics. I might have to click three dozen times to find the file for the image embedded in a Word document, or I may never find it (it could have been deleted).

We need a directed graph having files as nodes and relationships as edges. These relationships need to be automatically inferred whenever possible. Relationships should be able to span from RAM to hard disks, from one hard disk to another, across the internet, and everywhere in between.
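A toy sketch of the idea, with every name (`FileGraph`, the edge labels, the file names) invented purely for illustration: files as nodes, inferred relationships as labeled directed edges.

```python
from collections import defaultdict

class FileGraph:
    """Hypothetical sketch: files as nodes, relationships as labeled edges."""

    def __init__(self):
        self.edges = defaultdict(list)  # node -> [(label, node), ...]

    def relate(self, src, label, dst):
        self.edges[src].append((label, dst))

    def related(self, src, label=None):
        """Follow outgoing edges, optionally filtered by relationship label."""
        return [dst for lbl, dst in self.edges[src]
                if label is None or lbl == label]

g = FileGraph()
# In the envisioned system, tools would record these edges automatically.
g.relate("report.docx", "embeds", "chart.psd")
g.relate("main.c", "compiles-to", "app.exe")

print(g.related("report.docx", "embeds"))  # ['chart.psd']
```

The hard part is not the data structure but the automatic inference: applications and build tools would have to record these edges as a side effect of normal use.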

Data collections need to be easily searchable. Querying and filtering data should not require construction of a database; tables and indexes should be created automatically to accommodate data access patterns.

As much as possible, the work of managing data needs to be offloaded to computers so that humans can be left with the task of creating and utilizing data.

Computers Should Make Life Easier

If you find yourself sitting at your computer doing the same thing over and over, you are doing it wrong.

Computers can perform repetitive tasks with blazing speed and precision without getting tired or bored. Your computer won’t complain if you ask it to add all of the numbers between 1 and a billion, nor will it accidentally skip 10,956,234.

Knowing this to be true, why do I often see even expert computer users performing the same tasks a dozen times in a row? Why do programmers constantly write the same boilerplate code again and again? Why are companies constantly re-inventing the wheel?

Computers are no longer the big fancy calculators of the industrial age; they are the hammer and nails of the information age. So why is all modern software written to run on platforms designed for big fancy calculators?

Personal computing needs to be reinvented for the information age from the ground up.