(Updated 10 Dec 2010 -- corrected link to 3rd Place Hilversum developer Kornel Lesinski's Twitter page.)
Last month, more than 50 developers assembled in Hilversum, Netherlands, and San Francisco, California for an HTML5 game jam.
The idea of HTML5 gaming may seem unusual, but if the results from this event are anything to go by, there will be plenty more HTML5 games in the future. In just over 24 hours of coding, attendees were able to produce the seeds of great games, powered by standard web technologies. The games we saw were novel, visually appealing, and in many cases, already very playable.
HTML5 is making it easy to develop games for standard web browsers, and it also provides a way for developers to reach mobiles and tablets with a single code base. Watch for other initiatives, like Mozilla's current HTML5 gaming competition, to take HTML5 gaming to the next level.
Here’s a look at the winners from both venues. You can see a detailed list of all the entries here.
A novel 8-bit style game where you “leap” over the bad guys. A good demo of the Canvas element and a complete game with levels and scoring. Congratulations David Ganzhorn and Mike Rotondo on winning the HTML5 Game Jam in the USA.
A puzzle game where you build a fortress to protect the monkey, demonstrating a physics engine in Canvas. Congratulations Tom Hastjarjanto on winning the HTML5 Game Jam in Europe.
A platform shooter involving turtle-like creatures on wheels, using Canvas. By Wolff Dobson, Charles Lee, Nicolas Coderre, Dan Fessler, Sara Asher. (No online demo at present.)
A refresh on the classic “Snake” game, demonstrating multiplayer powered by NodeJS and WebSocket, and 3D transforms of the canvas element. By David Durman & Ales Sturala. (No online demo at present, but code repository available.)
A casual puzzle game by Bruno Garcia, where you link up adjacent matching fruit.
A stunning 3D game inspired by the classic Syndicate series showcasing just how far we’ve come with Canvas-based graphics. Observe the collision detection and be sure to hit the “Flying Carpet” button as well as the space bar to fire! This game was also shown in the “Web or Native for Mobile Development?” session at the recent Google Developer Days conferences in Europe. Created by Kornel Lesinski, Peter van der Zee, and Edwin Martin.
A few other readily playable games you might enjoy are:
We were also honoured to have keynotes by two pioneers of web-based gaming. In Hilversum, the speaker was Tino Zijdel, creator of DHTML Lemmings back in 2004. Tino, coincidentally a Hilversum local, explained the tricks he used to make the game playable on the browsers of the day. He has since written his own account of the Game Jam. It’s in Dutch, so here’s an English translation. There were additional presentations from Yu Jianrong, who covered ten tips for HTML5 game development, and from Paul Irish on HTML5.
The San Francisco keynote was given by Marcin Wichary, who spoke about games and HTML5. Marcin is the creator of the Pac-Man doodle and also of the first version of the popular HTML5Rocks slides. He talked about his experience recreating Pac-Man and the timeless aspects of videogaming in the modern age, shared some behind-the-scenes trivia, and walked through the technology used to write and debug the doodle.
We thank SPIL Games for hosting and co-organising the Netherlands event, and we also thank Samsung for contributing a Galaxy Tab for the Game Jam at that venue. Developers working on touch apps were able to use the Tab for testing, and we later gave the device away as a prize. Congratulations to all who took part!
You can find more details about the event, including links to code repositories and further demos, at HTML5GameJam.com.
More and more websites are enhancing their login systems to include buttons for identity providers such as Google, Yahoo, Facebook, Twitter, Microsoft, etc. Users generally prefer this approach because it makes it easier for them to sign up for a new site. However, if a user already has an account at a website and is used to logging in with an email and password, it is hard to get them to switch to using an identity provider.
Google has recently released a sample site that shows how a website can migrate users away from password-based logins, and instead have them leverage an identity provider. This sample site incorporates many of the ideas of the Internet Identity community, as well as feedback from numerous websites that have been on the cutting edge of applying these techniques. The following video provides highlights of some elements of the user experience.
The sample site is at openidsamplestore.com, but we suggest first reading this FAQ which describes the site and has links to additional videos of some of the features. We hope website developers will use these techniques to reduce the need for passwords on their site.
Twenty years ago this month, Tim Berners-Lee published his proposal for the World Wide Web. Since then, web browsers and web programming languages have come a long way. A few of us on the Chrome team decided to write an online guide for everyday users who are curious about the basics of how browsers and the web work, and how their evolution has changed the way we work and play online. Called "20 Things I Learned about Browsers and the Web," this online guidebook is illustrated by Christoph Niemann, and built in HTML5, JavaScript and CSS3, with our friends at Fi.
In building an online book app, HTML5, JavaScript and CSS3 gave us the ability to bring to life features that hearken back to what we love about books, combined with the best aspects of the open web: the app works everywhere, on any device with a modern browser. Here are a few features of the book experience that we’re particularly excited about:
This illustrated guidebook is best experienced in Chrome or any up-to-date, HTML5-compliant modern browser. We hope you enjoy the read as much as we did creating it, at www.20thingsilearned.com or goo.gl/20things.
Google Person Finder has become a useful tool in responding to natural disasters by reconnecting people with their family and friends. We’ve been looking at the next phase of Google Person Finder and decided to begin hosting the open source project at Google Code. We’re inviting the developer community to help improve Google Person Finder and the PFIF data format.
Google Person Finder provides a common place to search for, comment on, and connect records from many missing person registries. After the January 12th earthquake in Haiti, a team of Googlers worked with the U.S. Department of State to quickly create a site that helped people who were affected by the disaster. The site was used heavily after the Chile earthquake in February and put in action again in April after the Qinghai earthquake in China and in August for the Pakistan floods.
The software powering Google Person Finder is open source so we’re listing the open issues and feature requests we’ve received over the past few months in hopes the community can help us improve the code. We’ve created a Developer Guide to help developers get started. As always, we invite those interested to post questions on our public Person Finder discussion group. Those who are interested in improving the PFIF data format can also join the PFIF discussion group.
In addition to opening our product for developers, we’ve decided it’s now time to turn off our Google Person Finder instances for Haiti, Chile, China, and Pakistan. It doesn’t seem useful to serve these missing person records on the Internet indefinitely, so we intend for each instance of Google Person Finder to run for a limited time. Once an instance has served its purpose, we will archive the PFIF records in a secure location for historical preservation for one year while we work to identify a permanent owner for these records. If a long-term owner cannot be found, we will delete the records after one calendar year. For more information, please feel free to review the Google Person Finder FAQ.
If you’ve used Google Search recently, you may have noticed a new feature that we’re calling Instant Previews. By clicking on the (sprited) magnifying glass icon next to a search result you see a preview of that page, often with the relevant content highlighted. Once activated, you can mouse over the rest of the results and quickly (instantly!) see previews of those search results, too.
Adding this feature to Google Search involved a lot of client-side Javascript. Being Google, we had to make sure we could deliver this feature without slowing down the page. We know our users want their results fast. So we thought we’d share some techniques involved in making this new feature fast.
This is nothing new for Google Search: all our Javascript is compiled to make it as small as possible. We use the open-sourced Closure Compiler. In addition to minimizing the Javascript code, it also re-writes expressions, reuses variables, and prunes out code that is not being used. The Javascript on the search results page is deferred, and also cached very aggressively on the client side so that it’s not downloaded more than once per version.
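As a general illustration of the deferral pattern (a common technique, not our exact implementation; the file name is made up), a page can wait until its load event before injecting the compiled script, so script download and parsing never block the initial render:

function loadDeferredScript(src) {
  // Inject a script tag once the page has finished loading, so downloading
  // and parsing the compiled Javascript never blocks the initial render.
  var script = document.createElement('script');
  script.src = src;
  document.body.appendChild(script);
}

window.onload = function() {
  loadDeferredScript('/js/results.compiled.js');  // hypothetical file name
};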
When you activate Instant Previews, the result previews are requested by your web browser. There are several ways to fetch the data we need using Javascript. The most popular techniques are XmlHttpRequest (XHR) and JSONP. XHR generally gives you better control and error handling, but it has two drawbacks: browser caching tends to be less reliable, and only same-origin requests are permitted (though this is starting to change with modern browsers and cross-origin resource sharing). With JSONP, on the other hand, the requested script returns the desired data as a JSON object wrapped in a Javascript callback function, which in our case looks something like
google.vs.r({"dim":[302,585],"url":"http://example.com",ssegs:[...]}).
Although error handling with JSONP is a bit harder to do compared to XHR (not all browsers support onerror events), JSONP can be cached aggressively by the browser, and is not subject to same-origin restrictions. This last point is important for Instant Previews because web browsers restrict the number of concurrent requests that they send to any one host. Using a different host for the preview requests means that we don’t block other requests in the page.
There are a couple of tricks worth noting when using JSONP, such as injecting the script tag dynamically and cleaning it up once the callback has fired; the simplified sketch below shows the basic pattern.
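This sketch is deliberately simplified and is not our production code; the preview host, path, and callback name are made up for illustration:

function jsonpRequest(url, callbackName, onData) {
  var script = document.createElement('script');
  // Expose the callback globally so the script returned by the server,
  // e.g. previewCallback({...}), can hand us the JSON payload.
  window[callbackName] = function(data) {
    window[callbackName] = null;            // release the global
    script.parentNode.removeChild(script);  // clean up the injected tag
    onData(data);
  };
  script.src = url + '&callback=' + encodeURIComponent(callbackName);
  document.body.appendChild(script);
}

// Usage: fetch a preview from a separate (hypothetical) preview host, so the
// request does not compete for the page's per-host connection limit.
jsonpRequest('http://previews.example.com/preview?q=example', 'previewCallback',
    function(data) {
      // data.dim, data.url and data.ssegs arrive as in the response shown above.
      showPreview(data);  // showPreview is a placeholder for the rendering code
    });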
At this point you are probably curious as to what we’re returning in our JSONP calls, and in particular, why we are using JSON and not just plain images. Perhaps you even used Firebug or your browser’s Developer Tools to examine the Instant Previews requests. If so, you will have noticed that we send back the image data as sets of data URIs. Data URIs are base64 encodings of image data that modern browsers (IE8+, Chrome, Safari, Firefox, Opera, etc.) can use to display images, instead of loading them from a server as usual.
To show previews, we need the image, and the relevant content of the page for the particular query, with bounding boxes that we draw on top of the image to show where that content appears on the page. If we used static images, we’d need to make one request for the content and one request for the image; using JSONP with data URIs, we make just one request. Data URIs are limited to 32K on IE8, so we send “slices” that are all under that limit, and then use Javascript to generate the necessary image tags to display them. And even though base64 encoding adds about 33% to the size of the image, our tests showed that gzip-compressed data URIs are comparable in size to the original JPEGs.
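As a rough sketch of that client-side assembly (the slice array and container here are illustrative, not our exact response format), each slice simply becomes an image element stacked flush below the previous one:

function renderSlices(container, slices) {
  // Each entry in "slices" is assumed to be a data URI under IE8's 32K limit,
  // e.g. "data:image/jpeg;base64,/9j/4AAQ...". Stack the images with no gaps
  // so they read as a single screenshot.
  for (var i = 0; i < slices.length; i++) {
    var img = document.createElement('img');
    img.src = slices[i];
    img.style.display = 'block';
    container.appendChild(img);
  }
}

// Hypothetical usage, once the JSONP response above has arrived:
// renderSlices(document.getElementById('preview'), data.ssegs);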
We use caching throughout our implementation, but it’s important to not forget about client-side caching as well. By using JSONP and data URIs, we limit the number of requests made, and also make sure that the browser will cache the data, so that if you refresh a page or redo a query, you should get the previews, well... instantly!
There’s an exciting new event happening on December 6th, dubbed the “Woodstock for Cloud Developers.” We’ll be participating in Cloudstock, an industry event at San Francisco’s Moscone West that brings together a growing developer community and some of the leading cloud technology companies (such as Google, VMware, Salesforce.com and Amazon) to learn, hack and network.
Google is a strong believer in the open technologies powering the web, such as HTML5. Cloud computing is about powering innovations on the web with platforms and services that make developers like you more efficient and allow you to concentrate on solving business problems. No longer do you have to worry about the hassle of acquiring and managing servers, disks, RAM and CPU -- it’s all accessible in the cloud.
Google will be presenting the following sessions at Cloudstock:
We have another session which will be announced shortly -- stay tuned to this blog and the GoogleCode Twitter account!
Register for the free Cloudstock event at http://www.cloudstockevent.com/
Moscone West
San Francisco, CA
Monday, December 6th, 2010
Looking forward to meeting you there!
Google Project Hosting is all about helping software developers work together on source code, code reviews, issues, and wiki pages for technical documentation. In fact, the projects we host have collectively accumulated several million issue comments. Working together means that, from time to time, the ball is in your court and you need to respond to other users.
We send out notification emails to let the appropriate users know when an issue has been entered or a comment has been added to an issue, wiki page, or code review. These emails contain a link that allows you to enter your response in your project on code.google.com/p. But starting now, we are making it much easier and faster to respond to these comments by processing email replies that you send us.
So, check your inbox for new notification emails sent directly to you. When you see an email footer line that says you can reply, just press the reply button in your email client, bang out a thoughtful response, and hit “Send”. Project committers and owners can even update an issue’s status and other values via email. For example, to let your teammates know that you are working on an urgent defect report that just came in, you can reply with an update along the lines of the sketch below.
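The exact field syntax is described in the inbound email documentation linked at the end of this post; the lines here are purely illustrative:

Status: Started
Owner: your-username
Thanks for the report -- taking a look at this now.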
Please try it out the next time you receive a notification email. If you have questions, see our documentation on inbound email and user groups.
We know that developers are always interested in learning about new APIs and best practices for existing ones. And, one of the best ways to learn is face to face interaction with an expert in the subject.
Your friendly neighborhood Google Developer Relations team members work every day with the APIs you care about. We host, as well as attend, a number of events around the world to help as many developers as possible throughout the year. However, it hasn’t been easy for interested developers to find relevant events close to them.
We also realized that while many developers have met at least a couple of our Developer Advocates, it’s hard to tie an Advocate to their API expertise.
Enter the Advocate Bios and Developer Events pages.
The Advocate Bios page provides names, pictures and short descriptions of Developer Relations team members. You can filter them by what they work on and/or where they’re based.
The Developer Events page is a mashup of the Calendar and Maps APIs, running on an App Engine backend. Want to know about upcoming Android events in Prague? Or whether Patrick Chanezon is speaking at the GDD in Munich on Nov 9th? (He is!) You can do all of that and more with the Developer Events page.
Both the bios and the events pages are conveniently linked under the Developer Resources section on the Google Code home page.
We hope to see you at the events!
Last year, as part of Google’s initiative to make the web faster, we introduced Page Speed, a tool that gives developers suggestions to speed up web pages. It’s usually pretty straightforward for developers and webmasters to implement these suggestions by updating their web server configuration, HTML, JavaScript, CSS and images. But we thought we could make it even easier -- ideally these optimizations should happen with minimal developer and webmaster effort.
So today, we’re introducing a module for the Apache HTTP Server called mod_pagespeed to perform many speed optimizations automatically. We’re starting with more than 15 on-the-fly optimizations that address various aspects of web performance, including optimizing caching, minimizing client-server round trips and minimizing payload size. We’ve seen mod_pagespeed reduce page load times by up to 50% (an average across a rough sample of sites we tried) -- in other words, essentially speeding up websites by about 2x, and sometimes even faster.
(Video comparison of the AdSense blog site with and without mod_pagespeed)
Here are a few examples of optimizations that are a pain to do manually but that mod_pagespeed handles automatically on the fly, such as extending cache lifetimes and combining and minifying CSS and JavaScript; a sample configuration sketch follows below.
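As a rough illustration (the module path and filter selection here are examples only; see the documentation for your platform’s exact values), enabling mod_pagespeed comes down to loading the module and switching it on in your Apache configuration:

LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
ModPagespeed on
# Optionally enable specific rewriters, e.g. cache extension and CSS/JS minification.
ModPagespeedEnableFilters extend_cache,combine_css,rewrite_javascript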
We’re working with Go Daddy to get mod_pagespeed running for many of its 8.5 million customers. Warren Adelman, President and COO of Go Daddy, says:
"Go Daddy is continually looking for ways to provide our customers the best user experience possible. That's the reason we partnered with Google on the 'Make the Web Faster' initiative. Go Daddy engineers are seeing a dramatic decrease in load times of customers' websites using mod_pagespeed and other technologies provided. We hope to provide the technology to our customers soon - not only for their benefit, but for their website visitors as well.”
We’re also working with Cotendo to integrate the core engine of mod_pagespeed as part of their Content Delivery Network (CDN) service.
mod_pagespeed integrates as a module for the Apache HTTP Server, and we’ve released it as open source, with builds available for many Linux distributions. Download mod_pagespeed for your platform and let us know what you think on the project’s mailing list. We hope to work with the hosting, developer and webmaster community to improve mod_pagespeed and make the web faster.