Wednesday, May 28, 2008

'Old' Tech Still Has a Few Tricks Up Its Sleeve...

I've been a technology/gadget freak almost my entire life, and I remember being curious at an early age about how the technology around me (TVs, radios, remote-controlled cars) worked. This of course influenced my career path and led me to a Computer Science degree. However, I've never lost my love of all forms of technology, especially radio. That love of technology has also intersected with a desire I've developed to volunteer in emergency services/communications roles outside of my normal job (a desire cemented after I lost a colleague and friend on American Airlines Flight 11 on 9/11).

Two years ago, I studied for and received my Amateur Radio (Ham) technician class license (K6GWM). At the time, some of my tech friends questioned my decision to invest in such 'antiquated' technology. However, experience in emergency situations since then has taught me time and time again that the saying 'When all else fails, Amateur Radio' is absolutely true. While I'm as much of a cell phone/computer geek as the next guy, I've experienced firsthand how fragile those networks and connections can be. Amateur Radio and the commercial radio systems used by public safety agencies are the only communications methods that can truly be counted on to work in a disaster situation. During disaster drills, I've had to explain that fact a number of times to well-meaning, but ignorant, city officials who claim that their 'walkie-talkie' Nextel phone will be more than adequate should the 'Big One' hit.

Now, before everyone labels me a tech Luddite, I'd like to point out that there are some great examples of the synergy of old and new tech, especially in public safety agencies like the fire department. One of my volunteer roles is as a Volunteer in Prevention (VIP) with Cal Fire (California Department of Forestry and Fire Protection). Cal Fire, like other agencies of its kind, relies on dedicated commercial radio systems to perform its engine/crew dispatches and keep in contact with fire protection resources. However, it has evolved to include new technology in all of its operations, from Geographic Information Systems (GIS) to cell phones, satellite phones, pagers, and email/instant messaging. Our VIP group in the Cal Fire Santa Clara Unit has built up and staffs a fully mobile command center, with a mix of both Ham and fire radios, as well as a satellite phone connection, a full ethernet switch, and a multi-function printer/scanner/fax machine. We also keep enough connectors of every conceivable size and shape on hand to tap into local landlines, etc. Since Cal Fire is primarily known for working in wildland areas, the ability to bring both old and new tech to bear on a problem is critical.

As an example, on the recent Summit Fire, I volunteered in the Emergency Command Center, answering the information/media line, and I experienced firsthand how this flexible approach to technology helped fight the fire and keep folks informed. In the more remote areas where this fire burned, cell phone coverage was very spotty. The dedicated fire radio system kept all of the firefighting forces in realtime contact. However, internal communication among the command staff (including coordination of evacuation areas) was handled over the Research in Motion Blackberry network from the incident command post and Emergency Command Center. GIS was employed to map the fire and put up nightly Google Maps images of the fire containment lines. During all of this, those of us manning the information center passed information to the public over landline phones (still a critical 'old tech'). When one of my roles on the first day became serving as an information point at the front of Santa Clara Unit Headquarters (fielding media queries and directing incoming command staff to the command center), I relied on both my cell phone and my Ham radio to stay in contact with my supervisor and my teammates.

All of these experiences have reinforced the tenet that I think we should all remember - 'old' tech should always be evaluated to determine how it can best serve in conjunction with newer systems, not just dismissed out of hand because of its age.

Tuesday, May 20, 2008

Open Source and Corporations - Can't We All Just Get Along?

Corporations and the Open Source community seem to have a love/hate relationship most of the time, and I think it stems from several key misunderstandings (mainly on the part of the companies). A lot of companies (though there are notable exceptions, such as IBM) seem to look at Open Source as a source of free (as in beer, not as in speech) code/labor. In other words, management sees that code and those developers as a great way to shortcut the product development process, do it for less, and then 'profit ensues'.

I know that sounds funny to a lot of technology people (like me) who are familiar with and involved in the Open Source community, but that mentality is very real in larger/older companies (especially those in consumer electronics sectors). It doesn't help that marketing and PR grab ahold of the notion of Open Source as a great 'hook' to make their company or product offering sound sexy and appealing. The unfortunate reality is that a lot of corporate execs consider Open Source to be a 'risk management exercise' (yes, I actually had an executive that I met one time tell me this).

Now, lest anyone think I'm picking solely on the corporate guys, the Open Source community is complicit in this as well. In larger projects inhabited by ideological zealots who believe Open Source/Free Software is the end-all/be-all (think Richard Stallman), the blinders go on when it comes to the reality that businesses are in business to make money. Making money is not a bad thing, but the notion that all code should be free and open flies in the face of that for these people, causing them to spend lots of time, effort, and money chasing down every last license violation they can. I prefer to recognize the needs of both the corporations and the individuals and find a way for both of them to satisfy the WII FM (What's In It For Me) notion.

Now, let me be clear - I don't think Open Source license violations should be tolerated, but I'd like to see a lot more emphasis on finding 'win-win' situations between corporations and the community. The reduction in legal expenses alone should be incentive enough for both sides to try to come to amicable solutions. Below are some "Do's and Don'ts" that I hope will give both sides some ideas on how to accomplish this:


Corporations:

DO have a plan to work with Open Source communities - don't grab the code, fork it, and then go along your merry way. That costs you more money in the long run (for bug fixes and maintenance), and you lose at least 50% of the value of working with Open Source in the first place. Remember that 'compliance' and 'risk management' are only half of the plan - community interaction is at least as important.

DO participate actively in the community, whether it is mailing lists, forums, or other forms of communication. Make your voice heard, but remember you are but one voice in the community, and you are rarely the driving factor in decisions.

DO utilize Open Source in a way that you can treat it as a commodity - use your engineering resources on more value-added projects, since it is likely you will not be able to keep up with the amount of code that a good community project can produce.

DO re-evaluate if you truly have critical 'intellectual property' in a piece of code. If you don't, and the community is willing to extend/support that code, consider putting it out with an appropriate license to let them do so.

DON'T slight the community, and do your part for the project (if you find a bug, and determine it isn't fixed, offer to fix it and contribute the patch back).

DO try to keep up to date with the latest code revisions (both for your own regression testing and to allow you to contribute back fixes easily).

DON'T try to make up your own 'Open Source' licenses by attempting to modify an existing Open Source license for your purposes - the community will reject it out of hand, and your credibility will be zero.

DO remember that, except in certain circumstances, these developers do not work for you - plan your release cycles and development processes to take into account the variability of the community you are working with.

Open Source Community:

DO have patience with companies as they work through a ton of internal issues related to being better Open Source citizens.

DON'T immediately call out the attack dogs for perceived license violations - attempt to work with the company.

DO understand if companies choose to architect things in a way to avoid having to give up intellectual property through "license taint". See comment above about 'making money'.

I believe we need to get to a place where both the Open Source community and the corporate world work together consistently for the greater good of software. It is a difficult challenge, but one well worth expending the effort for.

Friday, May 16, 2008

Silicon Valley Prognosticators Convention

I recently attended the Churchill Club's annual 'Top 10 Tech Trends' event at the Fairmont Hotel in San Jose, CA. I had been looking forward to this night, since I've attended previous club events, but never this one. I wasn't disappointed, as the panel of tech prognosticators that they brought in was informative, entertaining, and had a good mix of opinions.

I'll lay out all of the trends below and comment on the few that interested me most. This was a panel format, with each panelist getting time to present two trends as their top thoughts on the future. After each presentation, the rest of the panel got to vote ('red' or 'green') and explain their reasons, and an audience poll was recorded after the panelists finished discussing each trend. The panel was a 'Who's Who' of Silicon Valley tech experts, venture capitalists, and technologists: Jurvetson, Khosla, Kopelman, McNamee, and Schoendorf.

The evening was moderated by Tony Perkins, Creator & Editor-in-Chief, AlwaysOn. The ten trends presented were:

  • Demographics are Destiny - Tech-savvy 'Boomers' getting older creates opportunities to harness their knowledge (Jurvetson)

  • Device That Used To Be a Phone - Mobile phones turning into mainstream computers to hold all of life's data (Khosla)

  • Rise of the 'Implicit' Internet - How the harvesting of your 'digital trail' could yield useful mashups to make your life easier (Kopelman)

  • Migration of Mobile Phone Market from 'Feature Phones' to 'Smart Phones' - many implications for OEMs and carriers (McNamee)

  • Water Tech Will Be More Important Than Global Warming - Our current focus on global warming may be less important than ensuring clean water supplies for everyone (Schoendorf)

  • Evolution Trumps Design - Utilizing evolutionary processes to design better technology (Jurvetson)

  • Fossilizing Fossil Energy - Traditional energy sources (oil, natural gas) will have increasing competition in the form of more economically produced biofuel (Khosla)

  • Venture Capital 2.0 - VC funds have had to grow larger to continue to make a profit and are shifting away from IT/Software opportunities toward other tech (Kopelman)

  • Within 5 Years, Everything You Need Will Be On Your Device - result will be more internet traffic to/from mobile devices than PCs (McNamee)

  • 80% of World Population Will Carry Mobile Devices in 5-10 Years - wireline phones/dial tone will be a thing of the past (Schoendorf)

The first trend out of the gate was one that is near and dear to our family, since my wife is a baby boomer. The idea that we will soon have a large population of older Americans who are tech savvy presents some interesting opportunities, both for corporations and for the boomers themselves. Jurvetson theorized an 'eBay-style' market for information processing or technical services. I could even imagine using the accumulated knowledge as a giant 'human supercomputer'. For the boomers themselves, being able to do productive work with sharper minds will provide both financial and health benefits, since studies have shown that those with stronger, more agile minds are likely to live longer and suffer less from the diseases of aging.

I'd like to address two very similar mobile phone trends together - 'Device That Used To Be a Phone' and 'Everything You Need Will Be On Your Device'. The panel and audience seemed split on both of these, mainly because they were not seen as 'future enough'. I will agree to a certain extent that in places such as Asia and Europe, the concept of a phone that replaces your credit cards, ID, keys, etc. is much further along. However, I believe the key element in Khosla's prediction is that all of your mobile phone data lives in the network 'cloud'. If we are going to rely upon these devices to hold our entire lives (and given the cultural bent toward Ludditism in North America, I'm not sure this will happen soon), there has to be a clean and easy way to 're-provision' a new phone should you lose or break your existing device. Even mobile phone carriers should applaud this one, as it would make it much easier to provision new customers and would encourage 'upgrades' without the pain that accompanies a phone replacement today: re-provisioning on the network, helping customers transfer address books, calendars, etc. I know I would appreciate something I could carry with me to replace all of the things I currently shove in my pockets when I'm ready to walk out the door. Putting the data from these devices into the network, and applying appropriate security measures (biometrics, plus something you know), would go a long way toward making this future a reality.

The one trend that really resonated with me was Kopelman's 'Rise of the "Implicit" Internet'. His proposal of harvesting the 'digital breadcrumbs' of one's Internet existence is squarely in the crosshairs of Social Knowledge Management - the ability to utilize data from disparate sources that is authored both by you and by others. There have been numerous attempts to break down the silos of information that exist about you on the net (Google search results, Facebook, etc.). Some of these have been more neutral (FOAF - Friend of a Friend), and some have been commercial (OpenSocial). I believe the first standard to be adopted in this space will have a huge impact in terms of both personal productivity and monetization opportunities for companies. This could also help usher in the era of Web 3.0 - the ability for computers to take over some of the more mundane tasks that we do 'by hand' now (travel booking, for instance). McNamee brought up the security concerns associated with this data harvesting (which are valid, to a degree), but was shot down by Khosla, who called the argument a 'red herring' and said that we have the technical know-how to make these kinds of systems secure.

Leaving aside the Water Tech and Fossilizing Fossil Energy trends for now (not because they aren't important, but because I believe we have the impetus and technological wherewithal to get those done), I'd like to close by spending some time on Jurvetson's Evolution Trumps Design. The idea of using evolutionary processes to help build better chemicals is not necessarily new, but utilizing the same processes in computer science and artificial intelligence for evolutionary computation is still reasonably recent in practice (even though research has been going on for quite some time). The idea of using evolutionary theory to build better programs by cycling through generations of candidate programs and winnowing the group down to the best one is fascinating to me, and I would hope that it eventually leads to more stable software overall. It would also remove some of the 'political' aspects of technology decisions if you could show that the system(s) chosen were the result of 'natural selection'.
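The winnowing process described above can be sketched as a simple genetic algorithm. This is a minimal, hypothetical illustration (the target string, parameters, and function names are my own, not anything from the talk): a population of random candidates is scored each generation, the fittest half survives, and mutated crossovers of the survivors fill out the next generation.

```python
import random

TARGET = "stable software"  # illustrative fitness target
GENES = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate):
    # Count characters matching the target: higher is better.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    # Randomly replace characters with a small probability.
    return "".join(random.choice(GENES) if random.random() < rate else c
                   for c in candidate)

def crossover(a, b):
    # Single-point crossover between two parent candidates.
    point = random.randrange(len(TARGET))
    return a[:point] + b[point:]

def evolve(pop_size=100, generations=500):
    population = ["".join(random.choice(GENES) for _ in TARGET)
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            break
        # Winnow: keep the fittest half, breed replacements from it.
        survivors = population[: pop_size // 2]
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(best, fitness(best))
```

Each generation keeps the best candidates ('natural selection') and breeds variations of them, so fitness ratchets upward over time - exactly the winnowing dynamic Jurvetson described, just applied to strings instead of programs.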

This night was very informative and entertaining, and I enjoyed meeting a lot of new folks at dinner and in the hallways. I look forward to next year's trends dinner, and hope to bring my blog readers more interesting content from other Churchill Club events. Thanks for reading!

Tuesday, May 13, 2008

In Search Of...

In today's installment, we go in search of... well, code. This being a technology blog, searching for answers in code seems like a natural thing to do. :)

In the last year or so, I've become more convinced that detailed, accurate code search technology is critically important to developer productivity, code reuse, and helping to squash pesky bugs on projects/pieces of code that have subtle inter-dependencies. I also believe that, properly implemented, social knowledge management/networking can be applied to further enhance the benefits of accurate code search.

First, let's start off with the major players: Google, Krugle, and Koders.

Google does a fairly good job of indexing existing Open Source code from the net, but their interface relies too heavily (in my opinion) on regular expressions, and doesn't provide the flexibility of search targets that Krugle and Koders do. Additionally, Google currently doesn't offer an Enterprise version of their tool for 'behind the corporate firewall' usage.
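To make the regex limitation concrete, here is a hypothetical grep-style search in Python (not any vendor's actual implementation): a plain regular expression matches an identifier wherever it appears, including comments and call sites, while a syntax-aware engine can narrow results to, say, just the function definition.

```python
import re

SOURCE = '''\
# helper: parse_config reads the config file
def parse_config(path):
    """Parse the config at path."""
    data = open(path).read()
    return data

result = parse_config("app.ini")
'''

# Naive regex search: matches the identifier anywhere,
# including comments and call sites.
naive_hits = [i + 1 for i, line in enumerate(SOURCE.splitlines())
              if re.search(r"parse_config", line)]

# A crude "syntax-aware" filter: only count lines that define the
# function - the kind of distinction an engine with real language
# understanding can make across many languages.
def_hits = [i + 1 for i, line in enumerate(SOURCE.splitlines())
            if re.match(r"\s*def\s+parse_config\b", line)]

print(naive_hits)  # lines 1, 2, 7: comment, definition, call site
print(def_hits)    # line 2: just the definition
```

A developer hunting for where a function is defined (versus merely mentioned) gets a much shorter, more useful result list from the second query - which is the flexibility of search targets referred to above.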

Now is the time for full disclosure - in my role at my employer, I helped evaluate Koders vs. Krugle. While they both provided a lot of similar functionality (easier search interface than Google, Enterprise versions, limited reporting, syntactic understanding of different programming languages), I ended up recommending Krugle because of the following factors:

  • Deployed as a network appliance - based on Open Source components like Linux and MySQL

  • Early stage development - the ability to help shape and define this product going forward

  • Innovative features such as 'tagging/annotating' files, link creation for sharing, and the 'Show Clones' option

  • The ability to have 'related information' easily at hand in the interface (SCM checkin comments, tech pages, related code from external sources)

Now, to be fair, it hasn't been a completely smooth ride during our Beta deployment of Krugle, BUT the team from Krugle has been very responsive to our needs and has queried us on numerous occasions for input into their product planning process. In a lot of respects, it has felt more like an Open Source community engagement than a traditional vendor/customer relationship, and I think that in a nascent field like code search, this is extremely important.

Developers in my organization have given a mostly positive review to this initial trial installation, and have provided some feedback which we have shared with Krugle. I mentioned earlier the social knowledge management/networking aspects I'd like to see applied to code search. These fall into categories such as 'voting', 'tagging', 'enhanced annotation', and 'collaboration/discussion'.

I'd love to see the ability for users of systems like Krugle to 'vote' a piece of code up or down. Doing this would affect not only the order in which a piece of code was returned while searching, but would also give the searcher some measure of what others thought of that particular source code. While Krugle and Koders do provide a rudimentary form of 'tagging', I'd like to see that tagging infrastructure made pluggable, to allow integration with external social bookmarking sites or with internal installations of Open Source equivalents like Scuttle.
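As a sketch of how such voting might feed back into result ordering (purely hypothetical; neither Krugle nor Koders worked this way, and all names here are illustrative), a search engine could blend its own text-relevance score with a smoothed community vote ratio:

```python
from dataclasses import dataclass

@dataclass
class Hit:
    path: str
    relevance: float  # engine's text-match score, 0..1
    up: int           # community upvotes
    down: int         # community downvotes

def vote_score(hit, prior=5):
    # Smoothed vote ratio: `prior` acts like phantom neutral votes,
    # so a file with one upvote doesn't outrank a heavily reviewed one.
    return (hit.up + prior / 2) / (hit.up + hit.down + prior)

def rank(hits, vote_weight=0.3):
    # Blend engine relevance with community sentiment.
    return sorted(hits,
                  key=lambda h: (1 - vote_weight) * h.relevance
                              + vote_weight * vote_score(h),
                  reverse=True)

hits = [Hit("util/strings.c", 0.90, up=2, down=9),
        Hit("lib/strbuf.c",   0.85, up=40, down=1)]
for h in rank(hits):
    print(h.path)
```

In this example the slightly less relevant but well-regarded file rises above a disliked one, which is the 'measure of what others thought' that raw text ranking can't provide.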

Krugle already provides for 'notes' or 'annotation' to a piece of code, but I'd like to see the ability to have URLs added to these notes, and have those automatically show up in the related information 'channels' that are already provided. Finally, the ability to start ad-hoc collaboration 'conversations' (forums, IM chats, shared code viewing sessions) directly from a file listing, or even from search results, would be a useful feature, especially for distributed development teams.
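Harvesting URLs out of free-text notes is the easy part; here's a hypothetical sketch of how a system might pull links from annotations so they could be surfaced in those related-information channels (the regex and sample notes are illustrative only):

```python
import re

# Simple pattern for http(s) URLs; a real system would need more
# care (trailing punctuation, internationalized domains, etc.).
URL_RE = re.compile(r"https?://[^\s)>\"']+")

notes = [
    "Fixes the race described at http://bugs.example.com/1234",
    "See design doc: https://wiki.example.com/strbuf (and the old thread)",
    "No links here, just a reminder to refactor.",
]

def related_links(annotations):
    # Pull every URL out of the free-text notes so the UI can show
    # them alongside SCM checkin comments and tech pages.
    links = []
    for note in annotations:
        links.extend(URL_RE.findall(note))
    return links

print(related_links(notes))
```

Once extracted, each link could be rendered as just another entry in the existing 'related information' sidebar, next to the checkin comments.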

I believe that code search technologies are on the verge of drastically improving developer productivity, but there are cultural barriers that still need to be broken down in enterprises, since peer review and collaboration are not as widely ingrained in those types of development organizations as they are in the Open Source world. I certainly hope that systems such as Krugle can continue to evolve and provide value to independent developers and organizations alike, while helping to break down those barriers.

Wednesday, May 7, 2008

Dash Express GPS Software/Traffic Model Updates

As most of you might guess from reading this blog, I'm big on social collaboration and 'crowdsourcing'. I've mentioned Dash Navigation in the past, since I think they epitomize that notion quite well. Gizmodo had a blurb about Dash's first major software/traffic model update today.

Quoting from the article:

"The Dash Express GPS just received its first historic traffic model update using the live Dash data gathered by users. That'll help predict traffic in areas where no Dash or other trusted data sources have been in the last 15 minutes. By end of month, a software update is coming with tweaks in performance, stability and routing."

I for one certainly hope that Dash becomes more successful and ubiquitous, not just for the cool collaborative traffic and their other features, but mainly for the ability to update maps over the air. I have a Nissan Murano with a built-in navigation system that works well (for the most part), but on a trip to Reno this past weekend, there were two huge gaps in the map data, which forced us to extrapolate our position manually. While this is at most a minor inconvenience, the worst part was that we had just recently purchased the new map DVD for $190, and then they turned around and sent us an email offering the 'updated' DVD for another $190. Besides annoying me to no end, this makes me yearn for Dash's much simpler (and cheaper) alternative for map updates.

I'm watching Dash closely, and may take the plunge once the hardware gets slimmed down a bit (which it inevitably will). These guys have the right idea, and if they can stick to their guns and resist the urge to compete feature for feature (with Bluetooth dialing, for example) with the Garmins and TomToms of the world, I think they'll be all right. Most people don't need or want 50 features, just 5 that are killer and work flawlessly.

Thursday, May 1, 2008


My colleagues and I have a standing joke when we talk about what makes social networks/collaborations/knowledge management work - it's everyone's favorite radio station, WII FM (What's In It For Me). To be sure, there are some social/crowdsourcing applications out there that people contribute to purely for altruistic reasons, or to share things with their friends. However, the vast majority of attempts to harness the 'group collaborative', or 'hive mind' for business purposes or knowledge management have failed unless there was a clear win in the WII FM department.

Two examples of applications where WII FM makes a huge impact are social bookmarking services and Dash Navigation (crowd-sourced realtime traffic). In both of these cases, there are clear benefits to the users of the respective applications, and the participation engendered by that benefit is the main reason the systems produce good data (search and traffic, respectively). It is very rare that a large enough proportion of people will contribute to a 'group data/application store' without seeing a significant, and more importantly, immediate benefit to themselves. At this point, one might raise the Wikipedia example, and while it is true that the crowd-sourced encyclopedia is quickly supplanting (if it hasn't already) most 'old-fashioned' references, there are intrinsic rewards that drive people to contribute to Wikipedia, not the least of which is seeing their name 'in lights'. I would argue that contributing to such efforts also helps one stumble upon other areas of interest/exploration, so that satisfies the WII FM theory as well.

If you look closely at Open Source, you'll also see instances of WII FM, despite the popular press's characterization of the movement as 'peace, love, free code for all'. To be fair, there are a number of Open Source projects that do exhibit those traits, but a large portion of Open Source code is still driven by the WII FM mentality, especially when big corporations become involved. 'Community goodwill' is never the primary motivation for any corporate participation in Open Source, despite what the press releases and PR spin may lead you to believe. Despite this, I believe that 'What's In It For Me' doesn't necessarily have to be a bad thing, as long as the needs of the community as a whole can be served by what certain corporations desire in the project. For the latest case of what could be an example of this, if it is pushed a little more by the big vendors, check out this Slashdot article on PC vendors 'strongly encouraging' suppliers to provide Open Source drivers. To be sure, this takes some of the onus off of the Dells of the world for support, but it also opens things up for the community to tinker with, fix, and enhance such drivers. In short, WII FM for both sides.

WII FM isn't bad in and of itself, but it certainly pays to be aware of it when evaluating various technologies or platforms, be they social networking/collaboration, Web 2.0, or Open Source. Understanding the ramifications of this idea can help you avoid big mistakes as you are rolling out a project, or choosing a technology direction.