tag:blogger.com,1999:blog-3237479500292933022024-03-12T20:52:17.885-07:00Midnight Sodapop TechnologyMusings about technology, written under the heavy influence of 23 flavors at midnight.Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.comBlogger17125tag:blogger.com,1999:blog-323747950029293302.post-64325457411310055052010-02-21T01:35:00.000-08:002010-02-21T02:15:34.618-08:00Why doesn't it work?<p>I love Windows. I'll make no attempt to hide it, I am a Windows user, through-and-through, now and forever. I've used Windows since I started computing, and though I've used both MacOS and Linux extensively, at the end of the day, it's all about Windows. Windows' blend of simplicity and power just cannot be beat, and its status as the majority platform means the availability of software is unparalleled.</p><p>Some days, however, I work with Windows, and just wonder. Today has been one of those days. Today, I installed Windows 7.</p>
<p>The poor design started before the installation began. I owned a copy of Vista Ultimate. I like Ultimate, but this time, my financial situation is a bit different, and Professional made a bit more sense to me when it came to buying Windows 7. So, I bought an upgrade, and popped the disc in. It let me go about halfway through the pre-upgrade process before saying "Hey wait a minute. This is Ultimate. You can't upgrade from Ultimate to Professional!" There are two failures here:</p><ol><li>The most obvious one: while I realize the upgrade may change a few things in the system, I SHOULD be able to go from Ultimate to Professional. It should NOT be an issue. Come on guys, this just gives undue credit to all the critics who hate the diverse product offering.</li><li>That notwithstanding - why did you allow me to waste a good 15 minutes in the pre-upgrade process before telling me this?</li></ol><p>Ultimately, I am forced to rebuild from scratch, leaving me with a lot more work than I'd like. So I begin this process, after shutting down my loyal Windows Vista install with a respectful salute, and the install goes smoothly till I hit the verification stage. I had printed my key out beforehand and proceed to type it in.</p><p>Fail.</p><p>Ok, well, I've probably mistyped it, let's go again....</p><p>Fail</p><p>Drat...one more time....</p><p>Fail</p><p>Ok......I compared the two.....They were identical! Crap. Something's wrong. After an hour of frantic digging, I finally found my original email only to find that the key I'd printed was in fact....the key for my beta copy! My bad. Ok...type it in and.....it's good! Phew. Lesson learned: Activation still sucks. I'm not sure how this can be solved, but I'd sure as heck like it to be.</p><p>Now, finally, I boot up. No video or sound drivers are loaded, so I'm not gonna get much here. I pop in my Boot Camp CD, which, after several errors and warning messages about incompatibilities, finishes the install. I restart, proceed to Windows Update to get some video/audio drivers, and restart again. Ok. Now - I realize Microsoft can't really address this problem. My hardware is not a "typical example" (being Apple), and Microsoft doesn't control the hardware platform like another fruit company I know, so cute startup videos are probably a non-starter, but I'd like to see a setup wizard that runs a Windows Update pass before showing the user the desktop. Just seems better to me.</p><p>Next up, the audio driver isn't working. Turns out this is because Apple has stopped releasing fixes for their 10.5 line of Boot Camp installers, and if I want the latest Boot Camp stuff, I've got to buy Snow Leopard. No thanks. This serves as a great demonstration of the difference between Apple and Microsoft. Microsoft ALLOWS others to break their software. With Apple, everything is so tightly controlled that the only one who ever breaks Apple's software is Apple themselves. They achieve this by creating very heavily regulated APIs, tightly monitoring distribution, and making the development options so heavily pattern-oriented that nobody but Apple can do any serious coding on the platform. It's REALLY easy to have a streamlined UE if you control all the hardware and software. When it came time to stand on equal footing with everyone else in the real software world, Apple's product turned out to be no better than anyone else's - buggy and rough around the edges.</p><p>Microsoft stands to improve their situation here. 
A few more standards and a bit tighter control would annoy a lot of software guys out there, but would probably work out better in the long run. Sure, eliminating (or emulating) the ActiveX framework would piss some people off, but it would be worth it. Microsoft needs to learn that they have to stop catering so much to the archivists - those who need to run 20-year-old software. They are too small a market sector to keep supporting, especially at such a large cost to the majority of users. Vista was actually a step in this direction, and a good one - for which it got loads of undeserved bad press. The driver model FINALLY changed, security was finally taken seriously, the UE was updated....lots of better ideas. But Microsoft has a long way to go.....</p><p>Anyways - Windows 7 is finally working for me (though my Mac's CD reader isn't cooperating at the moment - again...dumb Apple machine), and I'm writing this from my new OS. The hard part is over, and now I get to enjoy my lovely Windows again.</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-6276043574619841822009-10-15T20:49:00.000-07:002010-02-21T02:21:55.298-08:00Update: Google Voice RulesThey did it! Google Voice is now impressive, featuring integrated Google contacts, transcribed (impressively well!) voicemail in my inbox, SMS, greeting selection by contact, etc... All in all, a GREAT product, and one I am using extensively now.
In fact, if you like, you can ask me more about it by clicking the Call Me link on this blog page!
UPDATE: And they bought Gizmo! PLEASE do something cool with this, QUICKLY!Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-68260805055018279812009-07-09T21:58:00.000-07:002009-07-09T22:20:08.756-07:00Geocities is closing<p>A sad day this is for the internet. The original host-your-own super shitty site, the host of the original hamsterdance, and the butt of oh-so-many jokes - Yahoo's Geocities hosting community - is closing its doors on October 26th, 2009.</p>
<p>I, like so many others, got my start on Geocities. And while my original website is long gone, I still have content there, which I had forgotten about for nearly ten years (don't go looking for it please - it only has a 4 MB/hour bandwidth limit).</p>
<p>TEN YEARS! That's a <span style="font-weight: bold;">long</span> time when it comes to the internet. How many of you can remember the web ten years ago? It was a totally different ballgame back then, wasn't it?</p>
<p>I remember the days of Yahoo's first portal (which, amazingly, wasn't any less stupid than their current one), the Excite search engine, pages with embedded WAV files, and horrible table-based web design - design that wasn't so much design as 'how many cute clipart pictures can we add to a site'. I remember FrontPage 2000, the early versions of Dreamweaver....</p>
<p>Look how far we have come!</p>
<p>Geocities was the first, and now it is no more. It is a sad day for the internet, and the end of a historic era.</p>
<p style="text-align:center;font-family:cursive, script, serif;">R.I.P. Geocities
<br />
December 15, 1995 - October 26, 2009</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.comtag:blogger.com,1999:blog-323747950029293302.post-2418709314899594742009-02-06T00:52:00.000-08:002009-02-06T02:30:38.009-08:00Bil Conference<p>Check out the Bil Conference....I will be there!</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.comtag:blogger.com,1999:blog-323747950029293302.post-80704437102167442192008-07-09T22:30:00.000-07:002008-07-09T22:32:54.033-07:00Really? Why?<p><a href="http://arstechnica.com/reviews/os/open-moko-software.media/om2term.png">http://arstechnica.com/reviews/os/open-moko-software.media/om2term.png</a></p><p>I don't even have to write about that. That is a self-explanatory fail.</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.comtag:blogger.com,1999:blog-323747950029293302.post-54333190654505582432008-07-04T11:44:00.000-07:002008-07-04T11:58:12.226-07:00Grand Central + Google = ???<p>So Google bought Grand Central. A long time ago, in fact. Cool.</p><p>Why, then, have they yet to do anything cool with it?</p><p>They seem to be missing everything, like contact book synchronization, spam filter usage, Google account integration...but by far the greatest miss yet by Google is something that the world would be screaming to have: <span style="font-weight: bold;">voicemail in your email inbox</span>.</p><p>Think about it. Google + GrandCentral = Voicemail in your Gmail box. How awesome would that be? Simply go to Gmail, and, if someone has left you a phone message, you click it, the line triples in size to reveal a player control, and it plays your message. If you are viewing it with Gmail mobile, something suitably different happens. Voicemail messages are not necessarily included in the POP/IMAP feeds, but if they are, they arrive as a regular old email message with the phone number (no email address) set as the sender: a standard template of text with the MP3/Ogg Vorbis (one can dream) file set as an attachment, and maybe an ActiveX/Flash/embed player set in the page so that intelligent email clients could play it. Apple could release updated firmware that would allow it to integrate with the phone application on the iPhone (Apple will actually listen to Google, I suspect - at least they will if they are smart).</p><p>So how has Google missed this? These guys are supposed to be smart! With all the crazy buzz that Apple's visual voicemail has gotten, I cannot imagine that they have failed to see this! This would beat visual voicemail by a mile. Apple MUST be the only reason they haven't done it. Maybe they have some deal which precludes doing this.
</p><p>Google - if you are reading this - I'd really love to know why this hasn't been done...and believe me, if the reason isn't good, there will be another post on this blog which declares you all idiots. 'We didn't think about it' is not a good reason...'we didn't think about it, but we are going to do it as a top priority' is a bit better.
</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.comtag:blogger.com,1999:blog-323747950029293302.post-62166022763234759412008-06-23T10:00:00.001-07:002008-06-23T10:02:47.147-07:00Please fix live search<p>I dislike Live Search, mostly because of the way the results pages look. But I have a three-line fix that could make Live Search so much better.</p> <p>First look at the default output for Live Search:</p> <p><img style="border-top-width: 0px; border-left-width: 0px; border-bottom-width: 0px; border-right-width: 0px" border="0" alt="image" src="http://lh4.ggpht.com/abstractapproach/SF_Wv6sB0FI/AAAAAAAABGU/-UlCSuiWiN8/image_thumb.png?imgmax=800" width="244" height="128" /></p> <p>The text is all centered. The problem with this is that the margin against which it is centered is a false margin. It is imaginary. This confuses my mind. I look at this page and it makes very little sense. It is hard to read. I cannot go from link to link easily, and the link hierarchy is lost.</p> <p>Let's look at my favorite search engine, Google, and how they display results:</p> <p><a href="http://lh6.ggpht.com/abstractapproach/SF_WwHy680I/AAAAAAAABGY/JR9S_0Z54CQ/s1600-h/image%5B5%5D.png"><img style="border-top-width: 0px; border-left-width: 0px; border-bottom-width: 0px; border-right-width: 0px" border="0" alt="image" src="http://lh4.ggpht.com/abstractapproach/SF_Ww7KVS_I/AAAAAAAABGc/_h5igEJKJks/image_thumb%5B1%5D.png?imgmax=800" width="244" height="128" /></a></p> <p>Why is this better? It is better because the text margin is real - the side of my screen. It is easy to go from link to link, and hierarchies work fine in this display.</p> <p>So how can I fix Live Search? A simple three-line change will do it. I'm not sure how your site is laid out, since you seem to be including the style in the page (bad Microsoft), and using a newline stripper on top of that (I hope it isn't like the one I wrote once, not because it is special or I want to sell it, but because mine was really slow and unoptimized). But wherever your stylesheet is, remove:</p> <p><code>#sb_width <br />{ <br />    margin: 0pt auto; <br />    max-width: 990px; <br />    _width: 990px; <br />} </code></p> <p>That's it! For completeness, also remove the <code>text-align:center</code> on <code>#sb_page</code>, just for older versions of Internet Explorer. The end result will look like this (thank you Firebug!):</p> <p><a href="http://lh3.ggpht.com/abstractapproach/SF_WxfvSoXI/AAAAAAAABGg/T3bqQihkmnE/s1600-h/image%5B8%5D.png"><img style="border-top-width: 0px; border-left-width: 0px; border-bottom-width: 0px; border-right-width: 0px" border="0" alt="image" src="http://lh5.ggpht.com/abstractapproach/SF_WyCZhrzI/AAAAAAAABGk/qj7lRLPDKXI/image_thumb%5B2%5D.png?imgmax=800" width="244" height="128" /></a></p> <p>So much better! I might actually consider trying Live Search if it looked like this. If anyone actually knows Greasemonkey, and wants to write a script to do this, please do, and I will link to it (a rough sketch follows at the end of this post)!</p> <p>Notice: My suggestion may be used at any time, without credit, without any form of compensation, and without notice, or anything else. You can use it freely, Microsoft. Really. Please do.</p>
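<p>In the spirit of that invitation, here is roughly what such a Greasemonkey script might look like. Treat it as an untested sketch: it assumes the <code>#sb_width</code> and <code>#sb_page</code> IDs from the stylesheet above are still what the results page uses, and that the URL pattern still matches.</p> <p><code>// ==UserScript== <br />// @name    Left-align Live Search results <br />// @include http://search.live.com/* <br />// ==/UserScript== <br /> <br />// Undo the centering wrapper (IDs taken from the CSS above). <br />var width = document.getElementById('sb_width'); <br />if (width) { <br />    width.style.maxWidth = 'none'; <br />    width.style.width = 'auto'; // also neutralizes the _width IE hack <br />    width.style.margin = '0'; <br />} <br />// And the text-align fallback for older versions of IE. <br />var page = document.getElementById('sb_page'); <br />if (page) { <br />    page.style.textAlign = 'left'; <br />} </code></p>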
<p>
<strong>[NOTE]</strong> I am aware of how lame the images are. I will replace them soon with higher-quality versions...Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-30449771811727469622008-06-14T18:17:00.001-07:002008-06-14T18:17:15.917-07:00Why Cellphones on Airplanes Suck<p>I am never against progress. Or almost never. Today I discovered the first technology I don't really agree with - microcells on airplanes.</p> <p>In fact I am writing this on an airplane, using Windows Live Writer, as I am not connected to the internet. I am writing this after the fifth time I thought of looking something up online on my iPhone and realized I couldn't.</p> <p>Now I'm not one of those academic/apocalyptic freaks who say we should not rely on the internet for information. On the contrary, I'm building a company based on the idea that we should - and it should be available, context-relevant, in the palm of our hands, on a nifty device. So why should I be against connectivity on airplanes?</p> <p>The reason is fairly simple. I am against cell service in planes because I believe that, no matter how powerful and amazing internet devices are, we all need to spend some time disconnected. For example, I don't currently know what the traffic in Phoenix is like. And that is a good thing. I can find out when I get there. I don't have the information I wanted to look up on my phone. Fine. I will have it later.</p> <p>As a society we have become obsessed with instantaneousness. Instantaneousness rocks, I won't disagree with that. But we don't need it 24/7. We don't need it at all. In fact, it is bad for us. Instantaneousness causes stress, emotional problems, etc.</p> <p>In short, I love that I can look up information at the press of a button, and find anything I like with amazing ease. I love it. But I also like the fact that, occasionally, I am disconnected from it. Nobody can call me now. That is OK. I cannot email or text message anyone. Fine. I'm not one of those peace people either - this is not relaxing. I'm in a 737 with 6 seats to a row. Relaxation is unlikely.</p> <p>I do not feel any need to be disconnected. Rather, I think the opposite way - I do not feel any need to be connected, and I pity those who do. Think about it - if you cannot afford to be disconnected for a two-hour flight, you are a loser. There is no good reason to be that connected. Period. If anyone can find a reason, please, post it in my comments, and, if it really is a good reason, I'll edit this post. But as far as I know, two hours is not a very long period of time, and there is no good reason why you have to talk to your boss/coworkers/employees/friends/relatives/anyone during that time.</p> <p>I used to think microcells in planes were the next awesome thing. And maybe they are still not too bad for a 16-hour trans-Pacific flight. Fair enough. But for a 2-hour hop, it is a fail.</p> Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-91187775609221675902008-06-06T00:28:00.000-07:002008-06-06T00:36:20.316-07:00I'm going to WWDC......wearing a Microsoft shirt. Or at least I wanted to, though I've decided it would be rather poor judgment - especially while trying to present some new iPhone-based software.
Instead, I will be there in a sStitch shirt, talking it up, watching the event, and just generally hanging out. I will have my iPhone, and my tablet PC (speaking of which - when is Apple going to FINALLY make one?) with me, so I may blog a bit, but I can't promise anything. As if anyone really cares right now anyways, but I do like pretending they do.Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-19187055270020449892008-06-03T15:02:00.001-07:002008-06-03T15:39:44.735-07:00Dark Color Scheme...<p>After reading <a href="http://www.hanselman.com/blog/CommentView.aspx?guid=038f7325-ba8e-46d1-a1ad-ecc186167de8" target="_blank">a</a> <a href="http://weblogs.asp.net/infinitiesloop/archive/2006/08/06/Join-the-Dark-Side-of-Visual-Studio.aspx" target="_blank">number</a> <a href="http://www.winterdom.com/weblog/CategoryView,category,VS%2BColor%2BScheme.aspx" target="_blank">of</a> <a href="http://www.hanselman.com/blog/VisualStudioProgrammerThemesGallery.aspx" target="_blank">articles</a> about color schemes, I decided I just <strong>had</strong> to try this out.</p> <p>Normally I am a white background, black text guy. Simple enough, it works everywhere, but apparently, for coding, inverted color schemes are better. I have been averse to these schemes ever since my first web design efforts went live using massively inverted schemes and...god help me...Comic Sans fonts. So I've learned to keep it simple and natural.</p> <p>Fast forward to now, and, after some reading, I'm giving in and trying a custom-made natural inverted color scheme in Visual Studio. Here is a sample of what I'm doing:</p> <p><a href="http://lh5.ggpht.com/abstractapproach/SEW_ejaJG5I/AAAAAAAABF0/Hb-dAX3_ZK4/s1600-h/image%5B5%5D.png"><img style="border-top-width: 0px; border-left-width: 0px; border-bottom-width: 0px; border-right-width: 0px" border="0" alt="image" src="http://lh6.ggpht.com/abstractapproach/SEW_fn6arNI/AAAAAAAABF4/tYWEicudWeI/image_thumb%5B1%5D.png?imgmax=800" width="244" height="145" /></a></p> <p>(I took this using <a title="Jing - An awesome screen capture utility!" href="http://www.jingproject.com/" target="_blank">Jing</a>, which rocks!)</p> <p>A few quick details:</p> <ul> <li>The font is FangSong, a monospaced font of unknown origin. I could not find any information on it online, but I'm sure it came with one of my pieces of software - I did not download it. </li> <li>The font size is 11. FangSong is rather small, so I needed to make it a bit bigger. </li> <li>The background is a custom-created natural earth-tone brown, slightly heavier in red than green, with very low saturation and high luminance. </li> <li>The colors of elements are also mostly custom. The white text is not true white, but 232/240. </li> <li>Keywords are a deep cyan color: low luminance and high saturation. </li> <li>User types are a more aqua-style color - turquoise, again low luminance and high saturation. </li> <li>Brackets and other raw-text things are a slightly modified (darkened) yellow. </li> <li>XML comments are drastically different from the rest of the scheme. This is so that, while coding, I can quickly determine what is code and what is comment, without needing to shrink it (though I usually shrink it anyways). </li> <li>I haven't finished the rest, so I will update this scheme as I run into stuff I don't like. 
</li> </ul> <p>I've uploaded my scheme to my SkyDrive, so you can all try this out and see what you think...I'm not sure what I think yet, as I haven't worked much with it. I take confidence, however, in the fact that Visual Studio can restore its default settings with only a few clicks. As I exclusively use Visual Studio 2008 now, thanks to multi-targeting, this file will not work with 2005 (at least I doubt it will).</p> <p><a title="Matt's Natural Dark Scheme for Visual Studio 2008" href="http://jpesng.blu.livefilestore.com/y1pJNq97TCY90AVQTJAh0EikTkjg4RXu576kZn6H9XBoAKoeaE-Q4TNYWn1VhY91bpHctYF7KqD1pcF7rGiT31QzcfB29VMXBI0/Dark%20Color%20Scheme.vssettings?download">Natural Dark Scheme</a></p> <p>Let me know what you think!</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-32189275110435849992008-06-02T00:24:00.001-07:002008-06-23T10:16:56.603-07:00Html Time<p><strong>UPDATE - It appears that this IS actually in HTML 5 (and was before this post - I am not the instigator of this)! Sweet! Finally something well-engineered out of the W3C!</strong></p>
<p class="code">
<code><Time /></code>
</p>
<p>
Why not? Has anyone given this idea any thought before?
</p>
<p>
Here is the problem. I have a blog (in fact, this is true...I'm writing in it now...how...egocentric...). Each of my posts is tagged with a time. This time means something to me, since I set it in Blogger to be Mountain Standard Time sans Daylight Saving Time. But it doesn't mean much to the user, unless I
</p>
<ol>
<li>Tell them that I used that timezone somewhere, like the sidebar<br/>-or-</li>
<li>Post all times in a format which includes the timezone, like ISO 8601 (2008-06-02T00:20:31.52-07:00 anyone?)</li>
</ol>
<p>
Obviously the first solution is lame, since I now have to explain something I otherwise should not have to, while the second is just not readable by humans (3l33t h4xorz may technically be humans, and they can probably read it without effort, but they are too busy watching Star Trek, and so they do not count). That leads me to one conclusion: since I clearly do not want to explain to my users what time zone my posts are in, the simple solution is to automatically put each time in the reader's own time zone.
</p>
<p>
There are three ways to do this, and I will list them in order of how lame they are.
</p>
<p style="font-weight: bold;">Solution 1: Server Side Script</p>
<p>
Build a server-side engine to render your times based on the user's time preferences. The problem is, you have to have a user in the first place. I have <del>thousands of</del> many people visiting this blog who are not Blogger users and don't have a Google account. What about them? I could try to detect the time zone using JavaScript, but then I would be better off with...
</p>
<p style="font-weight: bold;">
Solution 2: Client Side Script
</p>
<p>
Using JavaScript, I can render a date object as a string, and retrieve the timezone offset data I need. The two functions of interest are <code>Date.toLocaleString()</code> and <code>Date.getTimezoneOffset()</code>. The former is simpler and easier to use, while the latter allows for complete control over the output format (since it returns only the offset, one could apply that offset to the original value and print the result out manually, using the date's properties as desired).
</p>
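<p>
To make that concrete, here is a minimal sketch of Solution 2. The timestamp parts are hard-coded from this post's own date; a real page would have the server emit them:
</p>
<p class="code">
<code>
<script type="text/javascript"><br />
// This post's timestamp: 2008-06-02T00:20:31-07:00.<br />
// Date.UTC() takes a zero-based month; subtracting the post's own<br />
// offset (in hours) turns the wall-clock parts into true UTC.<br />
var utcMillis = Date.UTC(2008, 5, 2, 0 - (-7), 20, 31);<br />
var posted = new Date(utcMillis);<br />
// Option 1: let the browser format it for the reader's locale.<br />
document.write(posted.toLocaleString());<br />
// Option 2: getTimezoneOffset() returns (UTC - local time) in<br />
// minutes, which is all you need to hand-roll your own format.<br />
var readerOffsetMinutes = posted.getTimezoneOffset();<br />
</script>
</code>
</p>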
<p>
The problem with this method is that it uses JavaScript. I do not mind using JavaScript. This particular issue, however, is
</p>
<ol>
<li>Fairly simple<br />-and-</li>
<li>Fairly consistent</li>
</ol>
<p>
This makes it a perfect candidate for encapsulation. This leads me to:
</p>
<p style="font-weight: bold;">
Solution 3: The DateTime Tag
</p>
<p>
Introduce a new HTML tag in version 5 called DateTime. DateTime is, well, a date and a time. By default, it could be formatted two ways: verbose and compact. The basic form might be:
</p>
<p class="code">
<code>
<datetime>
<br/>
<day>2</day>
<br />
<month>6</month>
<br />
<year>2008</year>
<br />
<hour>0</hour>
<br />
<minute>51</minute>
<br />
<second>2</second>
<br />
<timezoneoffset daylightsavings="false">-7</timezoneoffset>
<br />
</datetime>
</code>
</p>
<p>
Obviously, this is quite verbose, but it is also very readable and simple. An alternative form could be:
</p>
<p class="code">
<code>
<datetime>2008-06-02T00:51:02.00-07:00</datetime>
</code>
</p>
<p>
This form would be perfect for machine generation and reading, but is less readable. Clients could probably easily support both, however, especially the latter, since ISO 8601 is already the standard for dates in XML.
</p>
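<p>
And until user agents support the tag natively, a few lines of script could fake it. A naive sketch - the parsing here is mine and deliberately simple-minded, handling only the compact form:
</p>
<p class="code">
<code>
<script type="text/javascript"><br />
// Find every <datetime> tag, parse its ISO 8601 content, and<br />
// swap in the reader's local representation.<br />
function localizeDateTimes()<br />
{<br />
    var tags = document.getElementsByTagName('datetime');<br />
    for (var i = 0; i < tags.length; i++)<br />
    {<br />
        var m = /(\d{4})-(\d\d)-(\d\d)T(\d\d):(\d\d):(\d\d)(?:\.\d+)?([+-]\d\d):(\d\d)/.exec(tags[i].innerHTML);<br />
        if (!m) continue; // not the compact form - leave it alone<br />
        // Offset in minutes east of UTC; subtracting converts local to UTC.<br />
        var offset = (+m[7]) * 60 + (m[7].charAt(0) == '-' ? -m[8] : +m[8]);<br />
        var utc = Date.UTC(+m[1], m[2] - 1, +m[3], +m[4], +m[5] - offset, +m[6]);<br />
        tags[i].innerHTML = new Date(utc).toLocaleString();<br />
    }<br />
}<br />
window.onload = localizeDateTimes;<br />
</script>
</code>
</p>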
<p>
This would also have several other benefits besides time-zone relevance.
</p>
<p>
First, by expressly defining a data structure, we allow user agents to immediately identify it. Just as user agents can quickly and easily provide advanced functionality for images, they could provide similar, and equally cool, functionality for dates (like 'Create New Appointment at this Time').
</p>
<p>
Search engines can index pages better, since pages would provide automatic temporal metadata. Search improves - a page which lists only one date tells the engine a lot. Say that date is from 1995, and the page is clearly about programming. What are the chances that this is a modern article, versus some Unix geek's first website?
</p>
<p>
Finally, and perhaps most importantly, users can choose how best to represent time. Lots of us have personal preferences for time display (I personally like DOTW DD, MM/YY - HH:MM:SS), and we could each set our respective browsers to display it as we like. Heck, we could even come up with really cool custom displays, like analog clock images, or binary, or....you know where this leads.
</p>
<p>
Date and time form a very simple datatype, and should be treated as such. We have a structure for lists, specifically so we don't have to make a bunch of custom-formatted spans and put manual bullets up front. We have headers, paragraphs, etc. Why not times? And why stop here? Why not add other data types as well? Addresses would be nice (think automatic maps); any other good ideas?
</p>
<p>
As I don't have the time, and the W3C does not have the ears to listen, I'm not going to submit this suggestion. But, should anyone whom the W3C might actually listen to find it interesting enough to submit, feel free to do so. Let me know, and I will be happy to support it and talk about it more, so long as I am not speaking to a deaf wind.
</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-31312378974038271222008-06-01T13:59:00.000-07:002008-06-02T01:27:41.621-07:00Why the 7 inch computer is a fail<p>
In 2005, at WinHEC, Bill Gates decided he was tired of playing second fiddle to St. Jobs in marketing, and tried to pull something of his own out of his pocket. But he missed something rather big: it didn't fit in his pocket.
</p>
<p>
The 7-10 inch form factor is a fail in the consumer market. That's not to say that it shouldn't exist, or that there is no market for it - just that it is not quite as cool as people want you to believe it is.
</p>
<p>
A 7-inch PC serves a very niche market. Seven inches does not fit in your pocket (if it does, either lose some weight or get slightly less baggy pants). It also is not really usable as a machine. I have personally worked on a machine like that, and I cannot stand it for more than about five minutes. The screen is too small to display anything interesting, and the machines are too weak to do any real work.
</p>
<p>
That said, they exist, and, as a believer in strict free-market capitalism, I believe that they do not exist without purpose. There are many special applications in the world which could benefit from this device. I saw a demonstration at MIX where BMW built a rich internet application that interoperated with their dealers, and had those dealers carrying ultra-portable Vista machines running a neat WPF 3D app that integrated with the web. Very cool. In this case, it is small enough to carry short distances (inside the dealership) but big enough to show clients. Where a laptop (even a 12-inch tablet) would be too large and unwieldy, and a PDA/smartphone would be too small to show, a 7-inch PC is the perfect compromise. Another great example might be a field technician, who uses a touch-supportive device as his uplink to base, once again merging ease of quick carry with display size and ease of interaction.
</p>
<p>
But as consumer devices, these 7-inch machines cannot be of much use. Sure, you could hook one up to a desktop monitor and keyboard, and thus use it as a tower, but you would be paying $400 for a machine that does not outperform a 3-year-old budget tower (which is worth about $20 on eBay). These things are (usually) not even capable of running Vista, let alone Aero, leaving you stuck with a dated OS, no processor power for applications, little memory, and little else. If you're going to buy a micro-desktop, buy a Mac Mini (or Dell's version, whenever they get it through their thick heads that they should build one and sell it in the US). If you want a cheap PC, go to Walmart. But why buy an Eee PC?
</p>
<p>
Sure, as with business users, there are some consumers who could use a 7-inch machine, often for the same reasons. Perhaps as a home automation controller (until it gets lost in your kids' room), a car PC (can it dock?), a fancy portable DVD player (with a lousy screen), or even as an office organizer device (to show off how hip and well connected you are). These are small markets. But you can't carry one like an iPhone and you can't use it as a laptop, which are the two major markets. Not because of some grand Microsoft/government/Soviet/alien/religious conspiracy, but because of the simple fact that it doesn't physically fit in your pocket.
</p>
<p style="font-weight: bold; font-family: sans-serif; font-size: 130%;">FAIL</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-12754588410725945422008-05-31T18:01:00.000-07:002008-06-02T01:28:26.314-07:00A really excellent post chain, and two new RSS feeds.<p>
I'm not going to describe what is in these posts, as they are self-descriptive. Suffice it to say that anyone reading this should immediately go read the following three posts, in this order:
</p>
<ul><li><a href="http://www.hanselman.com/blog/ProfessionalismProgrammingAndPunditryAndSuccessAsAMetric.aspx">Professionalism, Programming, and Punditry and Success as a Metric</a></li><li><a href="http://girtby.net/archives/2008/5/22/blogging-horror">Blogging Horror</a><a></a></li><li><a href="http://www.codinghorror.com/blog/archives/001124.html">Strong Opinions, Weakly Held</a></li></ul>
<p>
All three are excellent and very well written. I have added both Girtby.net and Coding Horror to my RSS feed list.
</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-38294597100066238582008-05-26T23:12:00.000-07:002008-06-02T01:37:34.348-07:00Using null in programming<p>
In C (and most of its derivatives), Java, SQL, etc., it is called null. In Objective-C, Ruby, and Pascal, it is nil. Regardless, it means the same thing:
</p>
<p>
Nothingness - the absence of anything - utter non-existence.
</p>
<p>
Fredrik Normén wrote <a href="http://weblogs.asp.net/fredriknormen/archive/2008/05/22/avoid-returning-quot-null-quot-and-use-the-null-object-pattern.aspx">a post</a> questioning whether null objects should be returned from methods. His post brings to light a flawed coding principle - the thought that a method should avoid returning null because of null reference exceptions and the need to check whether a value is null afterwards. I must say that the argument for avoiding null is baseless and absurd, but, like many other programming practices out there, it has its roots in billions of lines of poorly written code in which null is used inappropriately.
</p>
<p>
Programs should feel free to return null, as it represents a state of nothingness, a non-result. I will consider three examples, built around a fictional music player program. First, I have a method which looks for songs of a given genre and returns them:
</p>
<p>
(note - the vast majority of code samples on this site will be in a C#/Ruby-like pseudo-code. They are not meant to be real code, but are more than likely very close, as it is actually easier that way)
</p>
<p class="code">
<code>
public Song[] GetSongsByGenre(string genre)<br />
{<br />
// Get songs by genre and return them.<br />
}
</code>
</p>
<p>
This method should never return null. The result of a search operation is never null, it is never nothing. It is always at least an empty collection - a value which has a point beyond just being nothing. It can be searched against, queried, iterated, etc. Null would not be a good thing to return here, and almost never is for methods returning collections.
</p>
<p>
But what about methods which return single objects?
</p>
<p class="code">
<code>
public class Song<br />
{<br />
public int PlayCount()<br />
{<br />
// return the number of times this song has been played.<br />
}<br />
}
</code>
</p>
<p>
Here is another method which should not return null (ignoring, for the moment, that it cannot in most languages, as int is a value type). Returning null is an indication of 'complete lack of anything', and this method cannot possibly legitimately return that. If it cannot legitimately execute for some unknown reason, the proper response is to throw an exception. It might return zero, but zero exists (unless you are still working in Roman numeral systems, in which case I wish you good luck with zero-based arrays), and so zero is not the same as null.
</p>
<p>
Now say I have this last bit of code:
</p>
<p class="code">
<code>
public class User<br />
{<br />
public Song GetFavoriteSong()<br />
{<br />
// Returns the user's favorite song<br />
}<br />
}<br />
<br />
public class UserInterface<br />
{<br />
protected void OnPlayFavoriteHotkeyPressed()<br />
{<br />
currentUser.GetFavoriteSong().Play();<br />
}<br />
}<br />
</code>
</p>
<p>
It is this type of code which typically gives rise to the argument against using null. User.GetFavoriteSong is a method that should be able to return null. If the user has no favorite song, then the proper value to return is 'nothing' or 'non-existence'. But in this case, the end result would be a null reference exception, since Play() cannot be called on null.
</p>
<p>
The proper pattern here is to do a check for null:
</p>
<p class="code">
<code>
protected void OnPlayFavoriteHotkeyPressed()<br />
{<br />
Song favoriteSong = currentUser.GetFavoriteSong() ?? AskUserForFav(currentUser);<br />
<br />
// Still nothing? The user declined to pick one - bail out.<br />
if(favoriteSong == null)<br />
{<br />
return;<br />
}<br />
<br />
favoriteSong.Play();<br />
}
</code>
</p>
<p>
Why is this code, which is far more complex than the previous version, better? Complexity is bad, right?
</p>
<p>
In this case, it is not. Here, the code actually <strong>needs</strong> to make a decision, whether the coder thinks decision-making is acceptable or not (if not, he probably voted for John Kerry). The code can be more simply explained as:
</p>
<blockquote>Get the user's favorite song. If it doesn't exist, ask them if they want to choose one. If they still do not choose one, just stop trying. Otherwise play the favorite song.
</blockquote>
<p>
There really is no alternative to this. No matter what you do, your program has to deal with whether or not the user has a favorite song.
</p>
<p>
Some would argue that by avoiding null, this can be solved in one place, the definition point, instead of collectively when the result is used. This rarely works, and this case is a fine example of that.
</p>
<p>
In this case, we have a situation where the code <strong>must</strong> use the object, and in order to use it, the code <strong>must</strong> know that it exists (or else be vulnerable to null reference exceptions), which means that the code has to either accept its existence as guaranteed or perform validation. If I tried to use some other pattern, I would still be forced to return <strong>something</strong> to satisfy this, and nothing cannot be converted to something here without user interference.
</p>
<p>
I could throw an exception, but then I'm forced to either allow the exception to bubble up and explode or catch it somewhere and do something about it, which really then becomes another, less obvious form of validation (don't get me wrong on exceptions though - there are <strong>many</strong> great places where this kind of pattern rules...this just isn't one of them).
</p>
<p>
I could return some type of sentinel value (like double.NaN) that would signify null, but that does no better, as I still have to check for it.
</p>
<p>
The bottom line is that null is actually a perfectly valid and highly useful pattern, when applied in the proper manner. Null is like every other programming pattern - an abstraction. Good programming recognizes that code abstractions should maintain a one-to-one relationship with real-world things. Null is a clear example of this. Null represents nothing, and developers can and should use it when nothing is called for - and only when nothing is called for.
</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-37314959230915102342008-05-24T03:18:00.000-07:002008-06-02T01:55:49.785-07:00Of Video Game Violence and Reality<p>
Many people today believe that video game violence is out of hand. This belief seems especially popular among US politicians, since it is a cheap way to appear to be a family activist - the losing side of the issue (video gamers) is, by a vast majority, underage, and therefore not voting. Unlike most issues, this is one with a clear position that doesn't damage them at the polls, even though it doesn't take into account personal liberty, freedom of speech, etc.
</p>
<p>
Yet, while I do not agree that politicians or the government should solve the problem (that would be censorship, and I am systematically opposed to all censorship), I do not deny the existence of a problem. Video gaming is clearly causing problems in America, but in a different way than most people might think.
</p>
<p>
The classic anti-video game argument is what I will call the Columbine argument.
</p>
<blockquote>Video games make violence more real, and therefore more acceptable to people. Video games teach otherwise healthy children to be violent by such portrayal and realism, in the same way that classical media (standard television) has been proven to do.</blockquote>
<p>
I argue that this point is in fact baseless. After playing Half-Life 2, which I find quite similar to other games out there, I have to seriously question this notion of realism. Playing a video game feels more like driving a car, with stress on reaction time, reflexes, muscle control, etc. And the only remotely realistic graphics are the reproductions of human weapons. But the vast majority of the time, I am using them to shoot at a jumping, duffel-bag-sized, four-legged spider from another world. This, I hate to say it, would feel significantly more realistic if Elvis appeared in the game, being returned by the aliens, as there would then be at least one relatable human character.
</p>
<p>
Rather, I argue that exactly the opposite is true - video games, rather than making violence more real, a part of our daily lives, make it seem considerably LESS real. Shooting oversized spiders feels totally unrealistic, and I can totally see how a young child, for whom this experience forms a significant portion of his experience with violence in general, would see violence in a totally different light than those who lived through World War II or Vietnam (note - I am 20, and have thus seen neither, but I also am not a gamer, so by far most of my experience with violence has been through studying these events). Violence to them will seem automatic. They will not recognize the terrible consequences of their actions. When they walk through a hallway, the game teaches them not to think twice before pulling the trigger. But in a video game, this is acceptable, because none of the effects of these actions are present. There is no funeral for the dead, no family left behind to mourn them. No wife and children at home who suddenly must struggle for food and basic needs, as their primary provider no longer exists. These people never learn to associate the action of firing a gun with the heartbreak of a lost loved one. How can they? The game is not 'real'.
</p>
<p>
Half-Life 2 has an evil button - F9 (and its complement, F6). Hit F9 and your game reloads from the last save. Useful in the game, but not in life. If I could do this, I would have finished high school with a 4.6 GPA and graduated valedictorian (actually not, since my school district decided that declaring a valedictorian hurt too many people's feelings and so they no longer have one...but that is another rant). In life there is no F9 button. I cannot simply try the same jump across the bridge 8 times, since after the first try, I am dead. I would think people would be smart enough not to be fooled by something this completely obvious, but I live in a country where more than half the people still think, as of this writing, that Saddam was involved in 9/11, despite the fact that Saddam and Osama hated each other...these people are demonstrably not particularly sharp. Even the bad guys come back to life!
</p>
<p>
When we talk today about war, the vast majority of Americans are not directly affected. Sure, most people either know or are related to a soldier, but by far most of us do not live in the direct company (partner, child, parent) of a military person. Given this, I don't believe America today shares the feeling of pain that comes when two uniformed guards at your door present you with a triangular-folded flag, salute you, and tell you that the one you once loved so dearly and knew so well will never again pass through that door alive.
</p>
<p>
If we did, we might think twice before committing our men and women, our sons and daughters, our mothers and fathers, to fight for their lives. I place deep value and respect in those men and women who fight or have fought to preserve American freedoms and values. But cheap oil and a wealthy upper class are not among those values.
</p>
<p>
Anyways, as for video games, I don't have a complete solution. If I did, I would be doing something better than writing this stupid blog post. I can tell you that censorship is not the solution - our soldiers have fought for our freedom from censorship, and it would be a grave disrespect, an equal lapse in reality, to give it up.
</p>
<p>
I can say that, by far, the biggest part of any solution must be education. People need to learn, as I have, the effect of violence throughout history, so that instead of their views on violence being shaped by how many of those damn spider crabs they could "frag" if they had a bigger grenade, they would be "shocked and awed", on a very deep and profound level, by the massive devastation and loss of life that took place during events like the D-Day invasion and the dropping of the atomic bombs. I was lucky enough to have a very good education - I was in the advanced tracks the whole way. I learned these things and studied them. I wrote papers, viewed images, watched documentaries, and read (good) books. My younger sister said some time ago, 'We use nukes all the time in Iraq! Nukes are awesome.' Clearly, something is wrong.
</p>
<p>
I still play Half-Life 2 and Halo, since I am able to separate the mindless killing of alien spiders from the destruction of human life in the real world. Fair enough. The problem is, I doubt very much that the vast majority of the video gaming community can. That is the real problem with video game realism.
</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-39217518463249727312008-05-21T04:25:00.001-07:002008-06-02T01:58:12.497-07:00Modern PC Games Suck<p>Let me start by saying I'm not a gamer. I don't get off by building a $10,000 gaming rig full of modded parts running slimmed-down operating systems designed to run Unreal and Crysis at a mad 120 fps. I don't even have a 'gaming rig'. I run games on a dual-core P4 (Centrino), 2 GHz, 2 GB of cheap memory, Vista-powered laptop with no real graphics card (Intel Mobile 945GM Express Chipset Family to be exact, a piece of junk), no real sound card, and not much else. This is the same machine I use for everything else, which is basically the point. I also do not have any console games. I have no Xbox, PlayStation, or Wii. In point of fact, I don't have a TV either.</p><p>
</p><p>I am a gamer in the sense that I'd like the ability to occasionally distract myself from what I'm doing by firing up a simple program and blowing the hell out of stuff. When I was younger, I enjoyed games, but never seriously, only as a supplement to other amusement activity.</p> <p>
</p><p>I have not played many games. The last popular game I played was Halo 1. Before that, it was Age of Empires 2 and StarCraft. So, recently, I decided I was tired of playing such outdated games. Sure, Diablo 1 is fun, but I can only smash so many of those little demon critters before I eventually lose interest and start thinking about how to write a piece of software to automate it. The instant I start thinking about software, macros, or other automation shortcuts, I know the game has lost its entire point, as I am not immersed in it, and my mind is not actually distracted.</p> <p>
</p><p>But I was unsure about how my system would react to more modern video games, as it is certainly not a power-gaming rig. So, after (very stupidly) losing my CD key to Halo 1 for PC, I undertook to try out (via the demos) a set of what seemed to be the most popular modern video games and determine which was the best. My plan was then to purchase one or two of my favorites. I am so glad I took this step, however, as I have been sorely disappointed by the results.</p> <p>
</p><p>First up is the lineup of games:</p> <ul> <li>Bioshock for PC - Downloaded via FilePlanet link, found at the official website.</li> <li>Tomb Raider Legends - Downloaded via Steam</li> <li>Tomb Raider Anniversary - Downloaded via Steam</li> <li>Half-Life 2 - Downloaded via Steam</li> </ul> <p>There were more, but I gave up after these.</p> <p>
</p><p>First off was Steam. I downloaded Tomb Raider Legends and tried to install it. It failed, saying it couldn't run some hyper-sensitive security (read: DRM) component. Fine, one down, lots to go...Vista's known for causing this kind of problem....</p> <p>
</p><p>Next I downloaded Half-Life 2, and got a similar fail. So the problem may not be the games, but Steam itself. FAIL! I'm going to try to download this on its own as a last resort, but as the download page says it still requires a Steam account, I don't have very high hopes (my only hope is that the developers were smart enough to correctly package their restrictive DRM 'feature' in their demo executable where the fools at Steam were not).</p> <p>
</p><p>So with Steam a fail, I then tried to download Tomb Raider Anniversary directly. And I was greeted with my first and only success of the night. It worked. Not really well, but it did work all right. But Tomb Raider is not exactly my first choice for gameplay. It is interesting and challenging, but its puzzle-heavy play is anything but mindless - and mindless is more what I was looking for - and while the visuals are nice, they are not unique and novel now that we have Google image search.</p> <p>
</p><p>So that left only Bioshock, which was supposedly an excellent game. Sadly, the external link was to a members-only section of FilePlanet. I hate FilePlanet. But I really wanted to play a video game. So I signed up, and I even downloaded their crapware IE plugin. And, two hours and several thousand ads displayed on my screen later (none of which I saw, as I was busy doing other, more interesting things), I had the installer. I unzipped and ran it, and was so happy when it said it had installed! But it was not to last. The first run comes up with a dialog saying my video card is unsupported, and the result will be unpredictable. Sure enough, five seconds later, the result is a complete program crash, without even a hint of a splash screen. No games for me.</p> <p>
</p><p>At this point I've invested four hours and the best I've gotten to run is a Tomb Raider game. Great. Discouraged, I gave up for the night. I'm not sure what I will do now, but I will probably either move back to old games, buy Halo 2, or maybe even just buy a console and a TV input card to use my PC as a display. You can't beat a console on simplicity, and that's what I want. Consoles are expensive, however, and I want to buy more MIDI gear for my studio.</p> <p>
</p><p>It shouldn't be this hard. No wonder every tech writer and their Grandma is predicting the death of PC gaming, as they finally have something in computing that neither of them can figure out.</p><p>
</p>
<p style="font-weight: bold;">[Update]</p>
<p>
Since then, I have been able to get Half-Life 2 to run despite its dependence on Steam, and I've decided I will buy The Orange Box, as I really did want a mindless video game to play. But I'm still not very happy here.
</p>
<p style="font-weight: bold;">[Update 2]</p>
<p>
I have beaten Half-Life 2 and Episode One. I cannot load Episode Two, however, which I think is the fault of the new Source engine...oh well, I give up.
</p>Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0tag:blogger.com,1999:blog-323747950029293302.post-4101272432882076022008-05-21T03:34:00.000-07:002008-05-27T01:32:33.927-07:00Welcome to Midnight Sodapop TechnologyI am your host, Matthew Schultz. I am a software developer for several small startup companies, a technologist, a fractal artist, a musician, and several other things, depending on the current need/how much I've had to drink.
I am entirely addicted to Dr. Pepper, and I drink completely excessive amounts of it, especially at night. I live a semi-nocturnal lifestyle, and am most productive, and most outspoken, at night.
I am a bit abrasive, and I have rather firm opinions. I don't mean to insult, usually, but if I do, please let me know, and I will correct as appropriate.Matthew Schultzhttp://www.blogger.com/profile/09286149739524424565noreply@blogger.com0