Posted by Erica Hanson, Program Manager
ARUA, UGANDA - Samuel Mugisha is a 23-year-old university student with a laugh that echoes off every wall and a mind determined to make change. Recently he heard from a healthcare worker that many children at a local clinic were missing vaccinations, so he decided to take a walk. He toured his community, neighbor to neighbor, and asked one simple question: “Can I see your vaccination card?”
In response, he was handed dirt-stained, wrinkled, torn pieces of paper holding life-or-death information - all written in scribble.
He squinted, held the cards to the light, rubbed them on his pant leg, but it was no use. They were impossible to read. As Samuel put it, “They were broken.”
From the few cards he could read, Samuel noted children who had missed several vaccinations - they were unknowingly playing the odds, waiting to see if disease would find them.
Without hesitation, Samuel got right to work, determined to fix the healthcare system with technology.
He first brought together his closest friends from Developer Student Clubs (DSC), a program that supports students using technology to make an impact in their communities. He asked them: “Why can’t technology solve our problem?”
This newly formed team, including Samuel, Joshwa Benkya, and Norman Acidri, came up with a twofold plan.
The idea came together right as Developer Student Clubs launched its first Solution Challenge, an open call for all members to submit projects they had recently dreamed up. These young developers had to give it a shot. They created a model, filled out an application, and pitched the idea. After waiting a month, they heard back - their team had won the competition! Their idea was selected from a pool of 170 applicants across India, Africa, and Indonesia. In other words, everything was about to change.
In a country where talent can go unnoticed and problems often go unsolved, this new team had pushed through the odds. Developer Student Clubs is a platform for these types of bold thinkers: students who view the issues of their region not simply as obstacles to overcome, but as chances to mend their home, build a better life for themselves, and transform the experiences of their people.
The goal of the Solution Challenge, and all other DSC programs, is to educate young developers early and equip them with the right skills to make an impact in their community.
In this case, office space in Uganda was expensive and hard to find, and Samuel’s team previously had few chances to work under the same roof. After winning the challenge, Developer Student Clubs helped them find a physical space of their own to come together and collaborate - a simple tool, but one that led to a turning point.
With this new space to work in, DSC then brought some of Africa’s best Google Developer Group Leads directly to the young developers. In these meetings, the students were given high-level insights on how best to leverage Android, Firebase, and Presto to propel their product forward.
As a result, the team realized that, given the scarcity of internet access in Uganda, Firebase was the perfect technology to build with - it lets healthcare workers use the app offline and then “check in” to receive updates whenever they can find a connection.
Although the app has made impressive strides since winning the competition, this young team knows they can make it even better. They want to improve its usability by implementing more visuals and are working to create a version for parents, so families can track the status of their child’s vaccination on their own.
While there is plenty of work ahead, with these gifted students and Developer Student Clubs taking each step forward together, any challenge seems solvable.
What has the team been up to recently? From August 5th to 9th, they attended the Startup Africa Roadtrip, an intensive week of training on how best to refine a startup business model.
Today we want to walk through some updated analysis of the benefit that prerendering provides for load times. AMP is designed to reduce page load time, and one of the most important ways Google Search reduces page load time is through privacy-preserving prerendering of AMP documents before a link is clicked.
The AMP framework has been designed to understand the layout of all page content and the loading status of all resources, so it can determine when all above-the-fold content has loaded. It also knows when the document is prerendered and when it is displayed. Thus, the AMP framework can compute the time from click until the above-the-fold content is displayed. AMP measures page load speed with a custom metric called First Viewport Ready (FVR). This is defined as the point in time "when non-ad resources above the fold fired their load event measured from the time the user clicks (So takes pre-rendering into account)". If an AMP document is fully prerendered, this metric will be 0. If prerendering was not complete at the time of click, or if the document was not prerendered at all, then the metric will be greater than 0.
Google Search prerenders some AMP documents and not others, so we are able to see the impact that prerendering has on FVR. The chart below shows percentiles for FVR with and without prerendering. FVR is 0 when the AMP framework successfully completes prerendering before the document is displayed.
First Contentful Paint (FCP) is a page load speed metric that is measured by the browser. It is available for all documents, not just AMP documents. FCP is the point in time when the first bit of content from the DOM is rendered. A high value for FCP indicates that a page is definitely slow, but a low value for FCP does not necessarily mean that a page loads quickly, since the first bit may not be important content. FCP is useful, but because AMP has a better understanding of what content is visible, FVR gives a clearer picture of when that content actually becomes visible.
FCP is not aware of prerendering, so AMP computes a prerender-sensitive derived metric, Prerender-adjusted First Contentful Paint (PFCP), that subtracts out the time before the click. When not prerendered, PFCP = FCP. PFCP also decreases with prerendering, but the difference is less dramatic than for FVR.
It may be surprising that median prerendered PFCP is higher than median prerendered FVR. This happens because the browser has to draw the content to the screen after the click. PFCP includes that time, while FVR does not.
Prerendering AMP documents leads to substantial improvements in page load times. Page load time can be measured in different ways, but all of them consistently show that prerendering lets users see the content they want faster. For now, only AMP can provide the privacy-preserving prerendering needed for this speed benefit. In the future, new web platform features such as Signed Exchanges will bring privacy-preserving instant loading to non-AMP documents too.
Posted by Vikram Tank (Product Manager), Coral Team
Coral’s had a busy summer working with customers, expanding distribution, and building new features — and of course taking some time for R&R. We’re excited to share updates, early work, and new models for our platform for local AI with you.
The Edge TPU compiler has been updated to version 2.0, adding support for models built using post-training quantization (only with full integer quantization; previously we required quantization-aware training) and fixing a few bugs. As the TensorFlow team mentions in their Medium post, “post-training integer quantization enables users to take an already-trained floating-point model and fully quantize it to only use 8-bit signed integers (i.e. `int8`).” In addition to reducing the model size, models that are quantized with this method can now be accelerated by the Edge TPU found in Coral products.
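For reference, here is a minimal sketch of full integer post-training quantization with the TensorFlow Lite converter. The model, the random calibration data, and the output filename are placeholders, and the exact converter flags can vary by TensorFlow version; the resulting `.tflite` file still needs to be run through the Edge TPU compiler before it can execute on the Edge TPU.

```python
import numpy as np
import tensorflow as tf

# Placeholder model; substitute your own trained Keras model.
model = tf.keras.applications.MobileNetV2(weights=None, input_shape=(224, 224, 3))

def representative_dataset():
    # Yield a few hundred samples that reflect the real input distribution.
    # Random data is used here only to keep the sketch self-contained.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Restrict conversion to int8 ops so the whole graph is fully quantized,
# which is what the Edge TPU requires.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]

with open('model_quant.tflite', 'wb') as f:
    f.write(converter.convert())
```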
We've also updated the Edge TPU Python library to version 2.11.1 to include new APIs for transfer learning on Coral products. The new on-device backpropagation API allows you to perform transfer learning on the last layer of an image classification model. The last layer of the model is removed before compilation and implemented on-device to run on the CPU, which allows for near-real-time transfer learning and doesn’t require you to recompile the model. Our previously released imprinting API has been updated to allow you to quickly retrain existing classes or add new ones while leaving other classes alone; you can now even keep the classes from the pre-trained base model. Learn more about both options for on-device transfer learning.
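As a rough illustration (not the exact interface), an imprinting-based retraining flow might look like the sketch below. The `ImprintingEngine` class name matches the Edge TPU Python library, but the model filename, method signatures, and argument shapes here are assumptions; check the library reference for the real details.

```python
import numpy as np
from edgetpu.learn.imprinting.engine import ImprintingEngine

# Assumed usage: keep_classes=True keeps the classes of the pre-trained base
# model while new classes are imprinted on top of it. The filename is
# illustrative only.
engine = ImprintingEngine('mobilenet_v1_l2norm_quant.tflite', keep_classes=True)

# Assumed call: one batch of example images for the class being added or
# retrained (flattened uint8 arrays); real data would come from labeled photos.
new_class_examples = [np.zeros(224 * 224 * 3, dtype=np.uint8) for _ in range(5)]
engine.train(new_class_examples, class_id=10)

# Write out a new .tflite model that includes the updated class.
engine.save_model('mobilenet_v1_l2norm_quant_retrained.tflite')
```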
Until now, accelerating your model with the Edge TPU required that you write code using either our Edge TPU Python API or in C++. But now you can accelerate your model on the Edge TPU when using the TensorFlow Lite interpreter API, because we've released a TensorFlow Lite delegate for the Edge TPU. The TensorFlow Lite Delegate API is an experimental feature in TensorFlow Lite that allows for the TensorFlow Lite interpreter to delegate part or all of graph execution to another executor—in this case, the other executor is the Edge TPU. Learn more about the TensorFlow Lite delegate for Edge TPU.
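For example, a minimal sketch of using the delegate from Python might look like this. The model path and the delegate library filename are placeholders, and the shared library name is platform specific (e.g. `libedgetpu.so.1` on Linux):

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the standard TF Lite interpreter with the Edge TPU delegate attached,
# so supported parts of the graph execute on the Edge TPU.
interpreter = tflite.Interpreter(
    model_path='model_quant_edgetpu.tflite',
    experimental_delegates=[tflite.load_delegate('libedgetpu.so.1')])
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one (dummy) input tensor and read back the result.
input_data = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]['index'])
```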
Coral has also been working with Edge TPU and AutoML teams to release EfficientNet-EdgeTPU: a family of image classification models customized to run efficiently on the Edge TPU. The models are based upon the EfficientNet architecture to achieve the image classification accuracy of a server-side model in a compact size that's optimized for low latency on the Edge TPU. You can read more about the models’ development and performance on the Google AI Blog, and download trained and compiled versions on the Coral Models page.
And, as summer comes to an end, we also want to share that Arrow offers a student and teacher discount for those looking to experiment with the boards in class or in the lab this year.
We're excited to keep evolving the Coral platform. Please keep sending us feedback at coral-support@google.com.