Steve Thomas - IT Consultant

After weeks of leaks and hype, Microsoft today officially announced Windows 11, the next version of its desktop operating system. While the company may have once said that Windows 10 was the last version of Windows, forgoing major point launches for a regular cadence of twice-yearly upgrades, it clearly believes that the changes in this update, and especially the redesigned user interface, warrant a new version number.

If you followed along with the development and eventual demise of Windows 10X, Microsoft’s operating system with a simplified user interface for dual- and (eventually) single-screen laptops, a lot of what you’re seeing here will feel familiar, down to the redesigned Start menu. Indeed, if somebody showed you screenshots of Windows 11 and early previews of Windows 10X, you’d have a hard time telling them apart.

Image Credits: Microsoft

As Microsoft Chief Product Officer Panos Panay noted in today’s announcement, the overall idea behind the design is to make you feel “an incredible sense of calm,” but at the same time, the Windows team has also worked to make it a lot faster. Windows Updates, for example, are supposed to be 40 percent faster, but Panay also noted that starting up your machine and even browsing should feel much faster.

Besides the new user interface, which makes copious use of translucency and shadows, one of the core new UI features is what Microsoft calls Snap Layouts. Hover over a window’s maximize icon and a small widget pops up that lets you move the window to any corner of the screen, something that previously involved dragging your window to the edge (which was often hard when you used multiple screens).

Developing…

With the upcoming release of iOS 15 for Apple mobile devices, Apple’s built-in search feature known as Spotlight will become a lot more functional. In what may be one of its bigger updates since it introduced Siri Suggestions, the new version of Spotlight is becoming an alternative to Google for several key queries, including web images and information about actors, musicians, TV shows and movies. It will also now be able to search across your photo library, deliver richer results for contacts, and connect you more directly with apps and the information they contain. It even allows you to install apps from the App Store without leaving Spotlight itself.

Spotlight is also more accessible than ever before.

Years ago, with iOS 7, Spotlight moved from its dedicated page to the left of the Home screen to a swipe-down gesture available in the middle of any screen, which helped grow user adoption. Now, the same swipe-down gesture works on the iPhone’s Lock Screen, too.

Apple showed off a few of Spotlight’s improvements during its keynote address at its Worldwide Developer Conference, including the search feature’s new cards for looking up information on actors, movies and shows, as well as musicians. This change alone could redirect a good portion of web searches away from Google or dedicated apps like IMDb.

For years, Google has been offering quick access to common searches through its Knowledge Graph, a knowledge base that allows it to gather information from across sources and then use that to add informational panels above and to the side of its standard search results. Panels on actors, musicians, shows and movies are available as part of that effort.

But now, iPhone users can just pull up this info on their home screen.

The new cards include more than the typical Wikipedia bio and background information you may expect — they also showcase links to where you can listen or watch content from the artist or actor or movie or show in question. They include news articles, social media links, official websites, and even direct you to where the searched person or topic may be found inside your own apps. (E.g. a search for “Billie Eilish” may direct you to her tour tickets inside SeatGeek, or a podcast where she’s a guest).

Image Credits: Apple

For web image searches, Spotlight now also allows you to search for people, places, animals, and more from the web, eating into another search vertical that Google provides today.

Image Credits: iOS 15 screenshot

Your personal searches have been upgraded with richer results, too, in iOS 15.

When you search for a contact, you’ll be taken to a card that does more than show their name and how to reach them. You’ll also see their current status (thanks to another iOS 15 feature), as well as their location from Find My, your recent conversations in Messages, your shared photos, calendar appointments, emails, notes, and files. It’s almost like a personal CRM system.

Image Credits: Apple

Personal photo searches have also been improved. Spotlight now uses Siri intelligence to let you search your photos by the people, scenes, and elements they contain, as well as by location. And it’s able to leverage the new Live Text feature in iOS 15 to recognize text in your photos and return relevant results.

This could make it easier to pull up photos where you’ve screenshotted a recipe, a store receipt, or even a handwritten note, Apple said.

Image Credits: Apple

A couple of features related to Spotlight’s integration with apps weren’t mentioned during the keynote.

Spotlight will now display action buttons on the Maps results for businesses that will prompt users to engage with that business’s app. In this case, the feature is leveraging App Clips, which are small parts of a developer’s app that let you quickly perform a task even without downloading or installing the app in question. For example, from Spotlight you may be prompted to pull up a restaurant’s menu, buy tickets, make an appointment, order takeout, join a waitlist, see showtimes, pay for parking, check prices and more.

The feature will require the business to support App Clips in order to work.

Image Credits: iOS 15 screenshot

Another under-the-radar change — but a significant one — is the new ability to install apps from the App Store directly from Spotlight.

This could prompt more app installs, as it reduces the steps from a search to a download, and makes querying the App Store more broadly available across the operating system.

Developers can additionally add a few lines of code to their apps to make in-app data discoverable within Spotlight and to customize how it’s presented to users. This means Spotlight can work as a tool for searching content inside apps — another way Apple is redirecting users away from traditional web searches in favor of apps.
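Under the hood, this kind of in-app discoverability is powered by Apple’s Core Spotlight framework, which lets an app donate items to the on-device search index. A minimal sketch, with the identifiers and recipe fields invented for illustration:

```swift
import CoreSpotlight
import MobileCoreServices

// Describe one piece of in-app content so Spotlight can find it.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Weeknight Pad Thai"
attributes.contentDescription = "A 30-minute recipe saved in the app."

// The identifiers let the app route the user to the right screen
// when they tap the result in Spotlight.
let item = CSSearchableItem(
    uniqueIdentifier: "recipe-42",
    domainIdentifier: "com.example.recipes",
    attributeSet: attributes
)

// Donate the item to the on-device index.
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```

When the user taps such a result, the system launches the app with the item’s unique identifier so it can deep-link straight to that content.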

However, unlike Google’s search engine, which relies on crawlers that browse the web to index the data it contains, Spotlight’s in-app search requires developer adoption.

Still, it’s clear Apple sees Spotlight as a potential rival to web search engines, including Google’s.

“Spotlight is the universal place to start all your searches,” said Apple SVP of Software Engineering Craig Federighi during the keynote event.

Spotlight, of course, can’t handle “all” your searches just yet, but it appears to be steadily working towards that goal.

Read more about Apple's WWDC 2021 on TechCrunch

Google today launched the second beta of Android 12. The first beta, which launched at Google’s I/O conference in May, introduced us to the first glimpses of Google’s new ‘Material You’ design system, though many of the promised new features and design tweaks weren’t part of that release yet. With this new beta, Google is bringing more of these to its testers (you can sign up for the beta here), including its new privacy dashboard that makes it easier for users to see which apps recently used a phone’s microphone, camera and location.

Other new features available in the beta are the addition of microphone and camera indicators that show users whether an app is using those sensors, as well as new Quick Settings toggles to disable app access to them. When access is toggled off, apps will receive blank audio and camera feeds. Related to this, Google is also bringing a clipboard read notification to Android that shows users when an app is reading from the clipboard.

Image Credits: Google

Also new in beta 2 is a new Internet Panel that makes it easier to switch between internet providers, Wi-Fi networks and the like.

Image Credits: Google

With this release, Google is now one release away from reaching platform stability in August. As the company notes, now would be a good time for developers to finish their compatibility testing and release compatible versions of their apps, SDK and libraries. Given the current monthly release cadence, we’ll likely see a final release of Android 12 in September.

Like before, you’ll need a compatible device to try out the beta. Unlike with some of the earlier preview releases, this list includes a lot of non-Google devices, with Sharp joining the beta program today, for example. You can find a full list of supported devices — and instructions for how to get started on non-Google devices — here.

Image Credits: Google

Just after the release of iOS 12 in 2018, Apple introduced its own built-in screen time tracking tools and controls. It then began cracking down on third-party apps that had implemented their own screen time systems, saying they had done so via technologies that risked user privacy. What wasn’t available at the time? A Screen Time API that would have allowed developers to tap into Apple’s own Screen Time system and build their own experiences that augmented its capabilities. That’s now changed.

At its Worldwide Developer Conference on Monday, Apple introduced a new Screen Time API that gives developers access to frameworks for building parental control experiences that also maintain user privacy.

The company added three new Swift frameworks to the iOS SDK that will allow developers to create apps that help parents manage what a child can do across their devices and ensure those restrictions stay in place.

The apps that use this API will be able to set restrictions like locking accounts in place, preventing password changes, filtering web traffic, and limiting access to applications. These sorts of changes are already available through Apple’s Screen Time system, but developers can now build their own experiences where these features are offered under their own branding and where they can then expand on the functionality provided by Apple’s system.
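Based on the FamilyControls and ManagedSettings frameworks Apple introduced for this API, applying such restrictions might look roughly like the sketch below. Treat the property names as illustrative of the frameworks’ settings groups rather than a definitive implementation:

```swift
import FamilyControls
import ManagedSettings

let store = ManagedSettingsStore()

// Ask for authorization to manage this (child's) device; the system
// handles verifying the parent.
AuthorizationCenter.shared.requestAuthorization { result in
    switch result {
    case .success:
        // Keep the parental-control app locked in place...
        store.application.denyAppRemoval = true
        // ...and prevent passcode changes.
        store.passcode.lockPasscode = true
        // Other settings groups (shield, webContent, account, etc.)
        // cover app limits, web filtering and account locking.
    case .failure(let error):
        print("Authorization failed: \(error)")
    }
}
```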

 

Developers’ apps that take advantage of the API can also be locked in place so they can only be removed from the device with a parent’s approval.

The apps can authenticate the parents and ensure the device they’re managing belongs to a child in the family. Plus, Apple said the way the system works lets parents choose the apps and websites they want to limit, without compromising user privacy. (The system returns only opaque tokens instead of identifiers for the apps and website URLs, Apple told developers, so third parties aren’t gaining access to private user data like app usage and web browsing details. This would prevent a shady company from building a Screen Time app only to collect troves of user data about app usage, for instance.)

The third-party apps can also create unique time windows for different apps or types of activities, and warn the child when time is nearly up. When time runs out, the app can lock down access to websites and apps, and perhaps remind the child it’s time to do their homework — or whatever other experience the developer has in mind.

And on the flip side, the apps could create incentives for the child to gain screen time access after they complete some other task, like doing homework, reading or chores, or anything else.

Developers could use these features to design new experiences that Apple’s own Screen Time system doesn’t allow for today, by layering their own ideas on top of Apple’s basic set of controls. Parents would likely fork over their cash to make using Screen Time controls easier and more customized to their needs.

Other apps could tie into Screen Time too, outside of the “family” context — like those aimed at mental health and wellbeing, for example.

Of course, developers have been asking for a Screen Time API since the launch of Screen Time itself, but Apple didn’t seem to prioritize its development until the matter of Apple’s removal of rival screen time apps was brought up in an antitrust hearing last year. At the time, Apple CEO Tim Cook defended the company’s decision by explaining that apps had been using MDM (mobile device management) technology, which was designed for managing employee devices in the enterprise, not home use. This, he said, was a privacy risk.

Apple has a session during WWDC that will detail how the new API works, so we expect we’ll learn more soon as the developer info becomes more public.


Apple today announced a number of coming changes and improvements to the App Store that will help developers better target their apps to users, get their apps discovered by more people, and even highlight what sort of events are taking place inside their apps to entice new users to download the app and encourage existing users to return.

The company said its App Store today sees 600 million weekly users across 175 countries, and has paid out over $230 billion to developers since the App Store launched, highlighting the business opportunity for app developers.

However, as the App Store has grown, it’s become harder for app developers to market their apps to new users or get their apps found. The new features aim to address that.

Image Credits: Apple

One change involves the app’s product page. Starting this year, app developers will be able to create multiple custom product pages to showcase different features of their app for different users. For instance, they’ll be able to try out things like different screenshots, videos, and even different app icons to A/B test what users like the most.

They’ll also be able to advertise the dynamic things that are taking place inside their apps on an ongoing basis. Apple explained that apps and games are constantly rolling out new content and limited time events like film premieres on streaming services, events like Pokémon Go fests, or Nike fitness challenges. But these events were often only discoverable by those who already had the app installed and then opted in to push notifications.

Image Credits: Apple

Apple will now allow developers to better advertise these events, with the launch of in-app events “front and center on the App Store.” The events can be showcased on the app’s product page. Users can learn more about the events, sign up to be notified, or quickly join the event, if it’s happening now. They can also discover events with personalized recommendations and through App Store search.

App Store editors will curate the best events and the new App Store widget will feature upcoming events right on users’ homescreens, too.

Apple says the feature will be open to all developers, including those who already run events and those who are just getting started.


Among many updates coming to iOS 15, Apple Maps will receive a number of upgrades that will bring more detailed maps, improvements for transit riders, AR experiences and other changes to the platform. The improvements build on the new map Apple began rolling out two years ago, which had focused on offering richer details, and — in response to user feedback and complaints — more accurate navigation.

Since then, Apple Maps has steadily improved.

The new map experience has since launched in the U.S., U.K., Ireland and Canada, and will now make its way to Spain and Portugal, starting today. It will then arrive in Italy and Australia later this year, Apple announced during its keynote address at its Worldwide Developer Conference on Monday.


Image Credits: Apple

In addition, Apple said iOS 15 Maps will include new details for commercial districts, marinas, buildings, and more. Plus, Apple has added things like elevation, new road colors and labels, as well as hundreds of custom designed landmarks — for example, for places like the Golden Gate Bridge.

Apple also built a new nighttime mode for Maps with a “moonlit glow,” it said.

 

For drivers, Apple added new road details to the map to help them better see and understand important features like turn lanes, medians, and bus and taxi lanes as they move through a city. The changes are competitive with some of the updates Google has recently made to its own Google Maps platform, which brought street-level details to select cities. These allowed people — including those navigating on foot, in a wheelchair, on a bike, or on a scooter, for example — to better see things like sidewalks and intersections.

Apple is now catching up, saying it, too, will show features like crosswalks and bike lanes.

It will also render things like overlapping complex interchanges in 3D space, making it easier to see upcoming traffic conditions or what lane to take. These features will come to CarPlay later in the year.

Image Credits: Apple

For transit riders, meanwhile, Maps has made improvements to help users find nearby stations.

Users can now pin their favorite lines to the top, and even keep track on their Apple Watch so they don’t have to pull out their phone. The updated Maps app will automatically follow your transit route and notify you when it’s time to disembark, making the app more competitive with third-party apps often favored by transit riders, like Citymapper, for instance.


Image Credits: Apple

When you exit your station, you can now hold up your iPhone to scan the buildings in the area, and Maps will generate an accurate position, offering directions in augmented reality. This is similar to the Live View AR directions Google announced last year.

This feature is launching in select cities in 2021 with more to come in the year ahead, Apple said.

Image Credits: Apple

 


Ads are coming to Twitter’s version of Stories, known as Fleets. The company announced today it will begin pilot testing Fleet ads in the U.S., which will bring full-screen, vertical format ads to Twitter for the first time, allowing it to better compete with the vertical ads offered across social media platforms, including Facebook, Instagram, Snapchat and TikTok, among others.

The new Fleet ads will appear in between Fleets from people you follow and will support both images and video in 9:16 format. The video ads support up to 30 seconds of content, and brands can also include a “swipe up” call-to-action within their ads.

For video, this is shorter than what Instagram offers (up to 120 seconds) or TikTok (up to 60 seconds), but is in line with best practices which stress that shorter ads can be better.

Twitter didn’t say how often you’ll see a Fleet ad as you swipe, saying only that it will “innovate, test and continue to adapt” in this area, as it learns how people engage.

Advertisers, meanwhile, will receive standard Twitter ad metrics for their Fleet ads, including impressions, profile visits, clicks, website visits, and more. And for video ads, Twitter will report video views, 6s video views, starts, completes, quartile reporting and other metrics.

Image Credits: Twitter

The company is launching the pilot program in the U.S. with just 10 advertisers, including those in tech, retail, dining and CPG verticals.

Twitter says the pilot will help the company understand how well these types of ads perform on Twitter, which will inform not only how it can better optimize Fleet ads going forward, but also where else it may launch full-screen ads further down the road. In addition, it wants to learn how people feel about and engage with full-screen ads as the test continues.

Twitter first began experimenting with Fleets in spring 2020 as a way to offer a Stories-like product experience where users could post ephemeral content. At the time, the company hoped Fleets would encourage more hesitant users to share content to the platform, as Fleets disappear after 24 hours, reducing the pressure to perform that comes with posting directly. They also don’t circulate on Twitter like retweets and quote tweets do, nor do they show up in Search or Moments.

Image Credits: Twitter

The feature rolled out to global users in November 2020. It was initially criticized by some who felt that Fleets were yet another example of how all social apps were starting to look the same. Nevertheless, Fleets have since become a core part of the Twitter experience.

Today, people use Fleets to point to other tweets they’ve posted, or to share personal updates, photos, and commentary. However, unlike Stories on other platforms, like Snapchat or Instagram, Fleets still offer a fairly bare bones experience in terms of creator tools. You can change the background color, add stickers and text, but that’s about it.

Twitter declined to say how many or what percentage of Twitter’s active user base has now adopted Fleets, noting instead that 73% of those who post Fleets say they also browse what others are sharing. The company says it plans to roll out new updates and features to Fleets in the future, as it continues to invest in the product.

Fleet ads will launch today in the U.S. across both iOS and Android.

At its annual Build conference today, Microsoft announced a couple of new features for version 91 of its Edge browser that, like so much at Build this year, aren’t earth-shattering (developer velocity!) but nice quality-of-life upgrades for its users. Since Microsoft develops Edge in the open, these may also feel familiar to those who keep a close eye on the Edge roadmap – indeed, I think I’ve seen most of these in Edge 90 already…

One new feature is Startup Boost, which allows Edge to start up almost instantly. The way Microsoft does this is pretty straightforward. It simply loads some of the core Edge processes whenever you boot up your Windows machine, so when you task Edge with starting up, there isn’t all that much work left to do. This shouldn’t have too much of an effect on your Windows 10 bootup time, so it’s probably a trade-off worth making, but I also can’t recall anybody complaining about browser startup times in the last couple of years either.

The other new feature is ‘sleeping tabs,’ which does pretty much what you expect it to do. It puts your tabs to sleep so they don’t use up unnecessary memory and CPU cycles.

Microsoft first announced that it was testing this feature back in December and at the time, the Edge team said that it reduces memory usage by 32% and helps improve battery life as well, given that sleeping tabs use 37% less CPU on average compared to non-sleeping tabs.

It’s worth noting that Google’s Chrome browser, which shares much of its underlying technology with Edge, also features tools to limit resource usage, including what Google calls ‘tab freezing,’ as does virtually every other major browser today.


There may be billions of IoT devices in use today, but the tooling around building (and updating) the software for them still leaves a lot to be desired. Esper, which today announced that it has raised a $30 million Series B round, builds the tools to enable developers and engineers to deploy and manage fleets of Android-based edge devices. The round was led by Scale Venture Partners, with participation from Madrona Venture Group, Root Ventures, Ubiquity Ventures and Haystack.

The company argues that there are thousands of device manufacturers who are building these kinds of devices on Android alone, but that scaling and managing these deployments comes with a lot of challenges. The core idea here is that Esper brings to device development the DevOps experience that software developers now expect. The company argues that its tools allow companies to forgo building their own internal DevOps teams and instead use its tooling to scale their Android-based IoT fleets for use cases that range from digital signage and kiosks to custom solutions in healthcare, retail, logistics and more.

“The pandemic has transformed industries like connected fitness, digital health, hospitality, and food delivery, further accelerating the adoption of intelligent edge devices. But with each new use case, better software automation is required,” said Yadhu Gopalan, CEO and co-founder at Esper. “Esper’s mature cloud infrastructure incorporates the functionality cloud developers have come to expect, re-imagined for devices.”

Image Credits: Esper

Mobile device management (MDM) isn’t exactly a new thing, but the Esper team argues that these tools weren’t created for this kind of use case. “MDMs are the solution now in the market. They are made for devices being brought into an environment,” Gopalan said. “The DNA of these solutions is rooted in protecting the enterprise and to deploy applications to them in the network. Our customers are sending devices out into the wild. It’s an entirely different use case and model.”

To address these challenges, Esper offers a range of tools and services that includes a full development stack for developers, cloud-based services for device management and hardware emulators to get started with building custom devices.

“Esper helped us launch our Fusion-connected fitness offering on three different types of hardware in less than six months,” said Chris Merli, founder at Inspire Fitness. “Their full stack connected fitness Android platform helped us test our application on different hardware platforms, configure all our devices over the cloud, and manage our fleet exactly to our specifications. They gave us speed, Android expertise, and trust that our application would provide a delightful experience for our customers.”

The company also offers solutions for running Android on older x86 Windows devices to extend the life of this hardware, too.

“We spent about a year and a half on building out the infrastructure,” said Gopalan. “Definitely. That’s the hard part and that’s really creating a reliable, robust mechanism where customers can trust that the bits will flow to the devices. And you can also roll back if you need to.”

Esper is working with hardware partners to launch devices that come with built-in Esper-support from the get-go.

Esper says it saw 70x revenue growth in the last year, an 8x growth in paying customers and a 15x growth in devices running Esper. Since we don’t know the baseline, those numbers are meaningless, but the investors clearly believe that Esper is on to something. Current customers include the likes of CloudKitchens, Spire Health, Intelity, Ordermark, Inspire Fitness, RomTech and Uber.

At its I/O developer conference, Google today announced a slew of updates to its Firebase developer platform, which, as the company also announced, now powers over 3 million apps.

There are a number of major updates here, most of which center around improving existing tools like Firebase Remote Config and Firebase’s monitoring capabilities, but there are also some completely new features, including the ability to create Android App Bundles and a new security tool called App Check.

“Helping developers be successful is what makes Firebase successful,” Firebase product manager Kristen Richards told me ahead of today’s announcements. “So we put helpfulness and helping developers at the center of everything that we do.” She noted that during the pandemic, Google saw a lot of people who started to focus on app development — both as learners and as professional developers. But the team also saw a lot of enterprises move to its platform as those companies looked to quickly bring new apps online.

Maybe the marquee Firebase announcement at I/O is the updated Remote Config. That’s always been a very powerful feature that allows developers to make changes to live production apps on the go without having to release a new version of their app. Developers can use this for anything from A/B testing to providing tailored in-app experiences to specific user groups.
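For context, the basic Remote Config flow in the Firebase iOS SDK looks something like this (the “welcome_message” parameter is a hypothetical example):

```swift
import FirebaseRemoteConfig

// Fetch the latest parameter values from the Remote Config backend
// and activate them, all without shipping a new build.
let remoteConfig = RemoteConfig.remoteConfig()

// In-app defaults are used until a fetch succeeds.
remoteConfig.setDefaults(["welcome_message": "Hello!" as NSObject])

remoteConfig.fetchAndActivate { status, error in
    if let error = error {
        print("Fetch failed: \(error.localizedDescription)")
        return
    }
    // Read the value the Remote Config console (or an A/B test)
    // assigned to this user.
    let message = remoteConfig.configValue(forKey: "welcome_message").stringValue ?? "Hello!"
    print(message)
}
```

A/B tests and the new Personalization feature then decide, server-side, which value each user’s fetch returns; the client code stays the same.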

With this update, Google is introducing updates to the Remote Config console, to make it easier for developers to see how they are using this tool, as well as an updated publish flow and redesigned test results pages for A/B tests.

Image Credits: Google

What’s most important, though, is that Google is taking Remote Config a step further now by launching a new Personalization feature that helps developers automatically optimize the user experience for individual users. “It’s a new feature of [Remote Config] that uses Google’s machine learning to create unique individual app experiences,” Richards explained. “It’s super simple to set up and it automatically creates these personalized experiences that’s tailored to each individual user. Maybe you have something that you would like, which would be something different for me. In that way, we’re able to get a tailored experience, which is really what customers expect nowadays. I think we’re all expecting things to be more personalized than they have in the past.”

Image Credits: Google

Google is also improving a number of Firebase’s analytics and monitoring capabilities, including its Crashlytics service for diagnosing app crashes. For game developers, that means improved support for games written with the help of the Unity platform, for example. For all developers, Firebase’s Performance Monitoring service now processes data in real time, a major update given that performance data (especially important on launch day) previously arrived with a delay of almost half a day.

Firebase is also now finally adding support for Android App Bundles, Google’s relatively new format for packaging up all of an app’s code and resources, with Google Play optimizing the actual APK with the right resources for the kind of device the app gets installed on. This typically leads to smaller downloads and faster installs.

On the security side, the Firebase team is launching App Check, now available in beta. App Check helps developers guard their apps against outside threats and is meant to automatically block any traffic to online resources like Cloud Storage, Realtime Database and Cloud Functions for Firebase (with others coming soon) that doesn’t provide valid credentials.
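On the client, opting in is a small change: register an attestation provider before configuring Firebase. A minimal sketch using the SDK’s built-in DeviceCheck provider on Apple platforms:

```swift
import Firebase
import FirebaseAppCheck

// Must run before FirebaseApp.configure(), e.g. at the top of
// application(_:didFinishLaunchingWithOptions:).
AppCheck.setAppCheckProviderFactory(DeviceCheckProviderFactory())
FirebaseApp.configure()

// From here on, this app's requests to protected Firebase backends
// (Cloud Storage, Realtime Database, Cloud Functions for Firebase)
// carry an App Check token; once enforcement is enabled in the
// console, traffic without a valid token is rejected.
```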

Image Credits: Google

The other update worth mentioning here is to Firebase Extensions, which launched a while ago but is getting support for a few more extensions today. These new extensions from Algolia, Mailchimp and MessageBird help bring features like Algolia’s search capabilities or MessageBird’s communications features directly to the platform. Google itself is also launching a new extension that helps developers detect comments that could be considered “rude, disrespectful, or unreasonable in a way that will make people leave a conversation.”

Flutter, Google’s cross-platform UI toolkit for building mobile and desktop apps, is getting a small but important update at the company’s I/O conference today. Google also announced that Flutter now powers 200,000 apps in the Play Store alone, including popular apps from companies like WeChat, ByteDance, BMW, Grab and DiDi. Indeed, Google notes that 1 in 8 new apps in the Play Store are now Flutter apps.

The launch of Flutter 2.2 follows Google’s rollout of Flutter 2, which first added support for desktop and web apps in March, so it’s no surprise that this is a relatively minor release. In many ways, the update builds on top of the features the company introduced in version 2, adding reliability and performance improvements.

Version 2.2 makes null safety the default for new projects, for example, to add protections against null reference exceptions. As for performance, web apps can now use background caching using service workers, for example, while Android apps can use deferred components and iOS apps get support for precompiled shaders to make first runs smoother.

Google also worked on streamlining the overall process of bringing Flutter apps to desktop platforms (Windows, macOS and Linux).

But as Google notes, a lot of the work right now is happening in the ecosystem. Google itself is introducing a new payment plugin for Flutter built in partnership with the Google Pay team and Google’s ads SDK for Flutter is getting support for adaptive banner formats. Meanwhile, Samsung is now porting Flutter to Tizen and Sony is leading an effort to bring it to embedded Linux. Adobe recently announced its XD to Flutter plugin for its design tool and Microsoft today launched the alpha of Flutter support for Universal Windows Platform (UWP) apps for Windows 10.

At its I/O developer conference, Google today announced the first beta of the next version of its Android Studio IDE, Arctic Fox. For the most part, the idea here is to bring more of the tooling around building Android apps directly into the IDE.

While there is a lot that’s new in Arctic Fox, maybe the marquee feature of this update is the integration of Jetpack Compose, Google’s toolkit for building modern user interfaces for Android. In Android Studio, developers can now use Compose Preview to create previews of different configurations (think themes and devices) or deploy a preview directly to a device, all while the layout inspector makes it easier for developers to understand how (and why) a layout is rendered the way it is. With Live Updates enabled, any change is then also directly streamed to the device.

The team also integrated the Android Accessibility Test Framework directly into Android Studio to help developers find accessibility issues like missing content descriptions or a low contrast in their designs.

Image Credits: Google

Just like with some of the updates to Android itself, the team is also looking at making it easier to develop for a wider range of form factors. To build Wear OS apps, developers previously had to physically connect the watch to their development machine or go through a lot of steps to pair the watch. Now, users can simply pair a watch and phone emulator (or physical phone) with the new Wear OS Pairing feature. All this now takes is a few clicks.

Also new on the Wear OS side is a new heart rate sensor for the Wear OS Emulators in Android Studio, while the Android Automotive emulator gains the ability to replay car sensor data to help those developers with their development and testing workflow.

Android Studio users who work on a Mac will be happy to hear that Google is also launching a first preview of Android Studio for the Apple Silicon (arm64) architecture.

Image Credits: Google