
Who helped your startup grow? Nominate a growth marketing agency.

For those who have been members of Extra Crunch for a while now, you have seen us go through two cycles of Verified Experts, covering startup attorneys and brand designers. Now, we turn our attention to our third community of startup professionals: growth marketers / hackers. Growth is the single most important objective of any startup, and so these professionals can have an outsized impact on which companies grow, and by how much.

Yvonne Leow is leading our search for the best growth marketers out there, as rated by founders. Worked with someone yourself? Impressed by the growth of a friend’s startup? Let us know with this handy form and get ready for new profiles in the coming weeks.

The Exit: Getaround’s $300M roadtrip

Our SF-based reporter Lucas Matney published his second piece in his “The Exit” series on, well, startup exits. Last time, Lucas looked at McDonald’s $300 million acquisition of Dynamic Yield, and now he looks at Getaround’s $300 million acquisition of Paris-based Drivy (why Lucas loves $300M transactions, I have no idea). He interviewed Jeremy Uzan of Alven Capital to get the details.

It’s the best and worst time to be in semiconductors right now. Silicon Valley investors are once again living up to their name and taking a deep interest in next-generation silicon, with leading lights like Graphcore in the United Kingdom hitting unicorn status while weirdly named and stealthy startups like Groq in the Bay Area grow up.

Growth in chips capable of processing artificial intelligence workflows is expected to swell phenomenally over the coming years. As Asa Fitch at the Wall Street Journal noted yesterday, “Demand for chips specialized for AI is growing at such a pace the industry can barely keep up. Sales of such chips are expected to double this year to around $8 billion and reach more than $34 billion by 2023, according to Gartner projections.”

Yet, all those rosy projections don’t suddenly make the financial results of companies like Nvidia any easier to swallow. The company reported its quarterly earnings last week, and the results were weak — pretty much across the board.

As its once-strong mobile division continues to slide, LG is picking up its focus on emerging tech. The company has pushed automotive, and particularly its self-driving capabilities, and today it doubled down on its smart home play with the announcement of its own artificial intelligence (AI) chip.

LG said the new chip includes its own neural engine that will improve the deep-learning algorithms used in its future smart home devices, which will include robot vacuum cleaners, washing machines, refrigerators and air conditioners. The chip can operate without an internet connection thanks to on-device processing, and it uses “a separate hardware-implemented security zone” to store personal data.

“The AI Chip incorporates visual intelligence to better recognize and distinguish space, location, objects and users while voice intelligence accurately recognizes voice and noise characteristics while product intelligence enhances the capabilities of the device by detecting physical and chemical changes in the environment,” the company wrote in an announcement.

To date, companies seeking AI or machine learning (ML) smarts at the chipset level have turned to established names like Intel, ARM and Nvidia, with upstarts including Graphcore, Cerebras and Wave Computing providing VC-fueled alternatives.

There is, indeed, a boom in AI and ML challengers. A New York Times report published last year estimated that “at least 45 startups are working on chips that can power tasks like speech and self-driving cars,” but that doesn’t include many under-the-radar projects financed by the Chinese government.

LG isn’t alone in opting to fly solo in AI. Facebook, Amazon and Apple are all reported to be working on AI and ML chipsets for specific purposes. In LG’s case, its solution is customized for smarter home devices.

“Our AI Chip is designed to provide optimized artificial intelligence solutions for future LG products. This will further enhance the three key pillars of our artificial intelligence strategy – evolve, connect and open – and provide customers with an improved experience for a better life,” IP Park, president and CTO of LG Electronics, said in a statement.

The company’s home appliance unit just recorded its highest quarter of sales and profit to date. Despite a sluggish mobile division, LG posted an annual profit of $2.4 billion last year with standout results for its home appliance and home entertainment units — two core areas of focus for AI.

Alchemist, the enterprise software and services-focused accelerator, has raised $4 million in fresh financing from investors BASF and the Qatar Development Bank, just in time for its latest demo day, where it is unveiling 20 new companies.

Qatar and BASF join previous investors including the venture firms Mayfield, Khosla Ventures, Foundation Capital, DFJ, and USVP, and corporate investors like Cisco, Siemens and Juniper Networks.

While the roster of successes from Alchemist’s fund isn’t as lengthy as Y Combinator’s, the accelerator program has launched the likes of quantum computing upstart Rigetti, the soft-launch developer tool LaunchDarkly, and drone startup Matternet.

Some (personal) highlights of the latest cohort include:

  • Bayware: Helmed by a former head of software defined networking from Cisco, the company is pitching a tool that makes creating networks in multi-cloud environments as easy as copying and pasting.
  • MotorCortex.AI: Co-founded by a Stanford Engineering professor and a Carnegie Mellon roboticist, the company is using computer vision, machine learning, and robotics to create a fruit packer for packaging lines. Starting with avocados, the company is aiming to tackle the entire packaging side of pick and pack in logistics.
  • Resilio: With claims of a 96% effectiveness rate and $35,000 in annual recurring revenue with another $1 million in the pipeline, Resilio is already seeing companies embrace its mobile app, which uses a phone’s camera to track stress levels and delivers app-based prompts on how to lower them, according to Alchemist.
  • Operant Networks: It’s a long-held belief (of mine) that if computing networks are already irrevocably compromised, the best thing companies and individuals can do is encrypt the hell out of their data. Apparently Operant agrees with me. The company claims 50% time savings with this approach and has booked $1.9 million in 2019 as proof, according to Alchemist.
  • HPC Hub: HPC Hub wants to democratize access to supercomputers by overlaying a virtualization layer and pre-installed software on underutilized supercomputers, giving more companies and researchers easier access to machines… and it has booked $92,000 worth of annual recurring revenue.
  • DinoPlusAI: This chip developer is designing a low latency chip for artificial intelligence applications, reducing latency by 12 times over a competing Nvidia chip, according to the company. DinoPlusAI sees applications for its tech in things like real-time AI markets and autonomous driving. Its team is led by a designer from Cadence and Broadcom and the company already has $8 million in letters of intent signed, according to Alchemist.
  • Aero Systems West: Co-founders from the Air Force’s Research Labs and MIT are aiming to take humans out of drone operations and maintenance. The company contends that for every hour of flight time, drones require seven hours of maintenance and checkups. Aero Systems aims to reduce that by using remote analytics, self-inspection, autonomous deployment, and automated maintenance to take humans out of the drone business.

Watch a livestream of Alchemist’s demo day pitches, starting at 3PM, here.


Hailo, a Tel Aviv-based AI chipmaker, today announced that it is now sampling its Hailo-8 chips, the first of its deep learning processors. The new chip promises up to 26 tera operations per second (TOPS), and the company is now testing it with a number of select customers, mostly in the automotive industry.

Hailo first appeared on the radar last year, when it raised a $12.5 million Series A round. At the time, the company was still waiting for the first samples of its chips. Now, the company says that the Hailo-8 will outperform all other edge processors and do so at a smaller size and with fewer memory requirements. “By designing an architecture that relies on the core properties of neural networks, edge devices can now run deep learning applications at full scale more efficiently, effectively, and sustainably than traditional solutions, while significantly lowering costs,” the company explains.

The company also argues that its chip outperforms Nvidia’s comparable Xavier AGX in some benchmarks, all while using less power and hence running cooler — something that’s especially important in small IoT devices.

We’ll have to see if that works out in practice once more engineers get their hands on these chips, of course, but there can be no doubt that demand for AI chips at the edge continues to increase. A few years ago, after all, the market shifted away from centralizing all processing in the cloud and toward the edge, in an effort to improve latency, reduce bandwidth costs and provide a more stable platform that doesn’t depend on network connectivity.

Like Mobileye before it (which was later acquired by Intel), Hailo is working with OEMs and tier-1 suppliers in the automotive industry to bring its chip to market, but it’s also looking at other verticals, including smart home products and really any industry where a high-performance AI chip is needed for object detection and segmentation, for example.

“In recent years, we’ve witnessed an ever-growing list of applications unlocked by deep learning, which were made possible thanks to server-class GPUs,” said Orr Danon, CEO of Hailo. “However, as industries are increasingly powered and even upended by AI, there is a crucial need for an analogous architecture that replaces processors of the past, enabling deep learning to run devices at the edge. Hailo’s chip was designed from the ground up to do just that.”

Two futuristic projects are coming together to help increase global internet access after Loon, the Google spinout that uses a collection of floating balloons to bring connectivity to remote areas, announced it has raised money from a SoftBank initiative.

HAPSMobile, a SoftBank project that is also focused on increasing global connectivity, is investing $125 million into Loon, according to an announcement from SoftBank made this morning. The agreement includes an option for Loon to make a reciprocal $125 million investment in HAPSMobile and it includes co-operation plans, details of which are below.

HAPSMobile is a one-year-old joint venture between SoftBank and U.S. company AeroVironment. The company has developed a solar-powered drone that’s designed to deliver 5G connectivity in the same way Facebook has tried in the past. The social network canceled its Aquila drone last year, although it is reported to have teamed up with Airbus for new trials in Australia.

Where Facebook has stumbled, HAPSMobile has made promising progress. The company said that its HAWK 30 drone — pictured below in an impression — has completed its initial development and the first trials are reportedly set to begin this year.

Loon, meanwhile, was one of the first projects to go after the idea of air-based connectivity with a launch in 2013. The business was spun out of X, the ‘moonshot’ division of Alphabet, last year and, though it is still a work in progress, it has certainly developed from an initial crazy idea conceived within Google.

Loon played a role in connecting those affected by flooding in Peru in 2017 and it assisted those devastated by Hurricane Maria in Puerto Rico last year. Loon claims its balloons have flown more than 30 million kilometers and provided internet access for “hundreds of thousands” of people across the world.

In addition to the capital investment, the two companies have announced a set of initiatives that will help them leverage their collective work and technology.

For starters, they say they will make their craft and balloons open for the other to use — so HAPSMobile can tap Loon balloons for connectivity and vice versa — and, connected to that, they will jointly develop a communication payload that works across both services. They also plan to develop a common ground station that could work with each side’s tech, along with shared connectivity that their airborne hardware can tap.

Loon has already developed fleet management technology because of the nature of its service, which is delivered by a collection of balloons, and that will be optimized for HAPSMobile.

The premise of HAPSMobile is very much like Loon’s.

Outside of tech, the duo said they will create an alliance “to promote the use of high altitude communications solution with regulators and officials worldwide.”

The investment is another signal that SoftBank’s appetite in tech investing is not limited to up-and-coming startups via its Vision Fund; more established ventures are also in play. Just yesterday, the Vision Fund announced plans to invest $1 billion in German payment firm Wirecard, and its past investments include ARM and Nvidia, although SoftBank has sold its stake in the latter.

Another round of followups on Nvidia, and then some short news analysis.

TechCrunch is experimenting with new content forms. This is a rough draft of something new – provide your feedback directly to the author (Danny at danny@techcrunch.com) if you like or hate something here.

Nvidia / TSMC questions

Following up on my analyses of Nvidia this week (Part 1, Part 2), a reader asked about Nvidia’s risk from China tariffs:

but the TSMC impact w.r.t. tariffs doesn’t make sense to me. TSMC is largely not impacted by tariffs and so the supply chain with NVIDIA is also not impacted w.r.t. to TSMC as a supplier. There are many alternate wafer suppliers in Taiwan.

This is a challenging question to definitively answer, since obviously Nvidia doesn’t publicly disclose its supply chain, or more granularly, which factories those supply chain partners utilize for its production. It does, however, list a number of companies in its 10-K form as manufacturing, testing, and packaging partners, including:

To understand how this all fits together, there are essentially three phases for bringing a semiconductor to market:

  1. Design – this is Nvidia’s core specialty
  2. Manufacturing – actually making the chip from silicon and other materials at the precision required for it to be reliable
  3. Testing, packaging and distribution – once chips are made, they need to be tested to prove that manufacturing worked, then packaged properly to protect them and shipped worldwide to wherever they are going to be assembled/integrated

For the highest precision manufacturing required for chips like Nvidia’s, Taiwan, South Korea and the U.S. are the world leaders, with China trying to catch up through programs like Made in China 2025 (which, after caustic pushback from countries around the world, it looks like Beijing is potentially scrapping this week). China is still considered to be one-to-two generations behind in chip manufacturing, though it increasingly owns the low-end of the market.

Where the semiconductor supply chain traditionally gets more entwined with China is around testing and packaging, which are generally considered lower value (albeit critical) tasks that have been increasingly outsourced to the mainland over the years. Taiwan remains the dominant player here as well, with roughly 50% of the global market, but China has been rapidly expanding.

U.S. tariffs on Chinese goods do not apply to Taiwan, so for the most part Nvidia’s supply chain should be able to avoid most of the brunt of the trade conflict. And while assembly is heavily based in China, electronics assemblers are rapidly adapting their supply chains to mitigate the damage of tariffs by moving factories to Vietnam, India, and elsewhere.

Where it gets tricky is the Chinese market itself, which imports a huge number of semiconductor chips, and represents roughly 20% of Nvidia’s revenues. Even here, many analysts believe that the Chinese will have no choice but to buy Nvidia’s chips, since they are market-leading and substitutes are not easily available.

So the conclusion is that Nvidia likely has maneuvering room in the short term to weather exogenous trade-tariff shocks and mitigate their damage. Medium to long term, though, the company will have to position itself very carefully, since China is quickly becoming a dominant player in exactly the verticals Nvidia wants to own (automotive, ML workflows, etc.). In other words, Nvidia needs the Chinese market for growth at the exact moment that door is slamming shut. How it navigates this challenge will determine much of its growth profile in the years ahead.

Rapid fire analysis

Short summaries and analysis of important news stories

Saudi Arabia’s Crown Prince Mohammed bin Salman. FETHI BELAID/AFP/Getty Images

US intelligence community says quantum computing and AI pose an ’emerging threat’ to national security – Our very own Zack Whittaker talks about future challenges to U.S. national security. These technologies are “dual-use,” which means that they can be used for good purposes (autonomous driving, faster processing) and also for nefarious purposes (breaking encryption, autonomous warfare). Expect huge debates and challenges in the next decade about how to keep these technologies on the safe side.

Saudi Arabia Pumps Up Stock Market After Bad News, Including Khashoggi Murder – A WSJ trio of reporters investigates the Saudi government’s aggressive attempts to shore up the value of its stock exchange. Exchange manipulation is hardly novel, either in traditional markets or in blockchain markets. China has been aggressively doing this in its stock exchanges for years. But it is a reminder that in emerging and new exchanges, much of the price signaling is artificial.

A law firm in the trenches against media unions – Andrew McCormick writes in the Columbia Journalism Review how law firm Jones Day has taken a leading role in fighting against the unionization of newsrooms. The challenge of course is that the media business remains mired in cutbacks and weak earnings, and so trying to better divide a rapidly shrinking pie doesn’t make a lot of sense to me. The future — in my view — is entrepreneurial journalists backed up by platforms like Substack where they set their own voice, tone, publishing calendar, and benefits. Having a close relationship with readers is the only way forward for job security.

At least 15 central banks are serious about getting into digital currency – Mike Orcutt at MIT Technology Review notes that a bunch of central banks, including those of China and Canada, are exploring digital currencies. What’s interesting are the trends backing this up, including financial inclusion and “diminishing cash usage.” Even though blockchain is in a nuclear winter following the collapse of crypto prices this year, it is exactly these sorts of projects that could be the way forward for the industry.

What’s next

More semiconductors probably. And Arman and I are side glancing at Yelp these days. Any thoughts? Email me at danny@techcrunch.com.

This newsletter is written with the assistance of Arman Tabatabai from New York

Yesterday’s analysis of Nvidia’s challenges triggered a surge of mail from readers. The company has lost about half of its value over the past two months, and has mostly blamed a “crypto hangover” for the problem. But as I pointed out yesterday, it’s really the three Cs: “Crypto, customers, and China.” There are nuances here worth exploring, though.

TechCrunch is experimenting with new content forms. This is a rough draft of something new – provide your feedback directly to the author (Danny at danny@techcrunch.com) if you like or hate something here.

Rapacious capitalists and short-termism

One major vein of reader feedback was around the remarkable short-termism of my analysis, which (mostly) looked at Nvidia over the past 60 days. As a reader named Stephen wrote to me:

By focusing on the peak price from this summer and its fall you ignore the fact that the stock price today is nearly the same as it was in June of 2017. Nvidia was on a huge run because of Bitcoin and the associated run on GPUs by miners. With the crypto currency market in decline so is the demand for advanced GPUs.

There is nothing Nvidia can do about that. They profited greatly from that blip and now they are returning to normal.

That’s entirely fair. After diving during the 2008 financial crisis along with the rest of the market, Nvidia’s market cap steadily gained value for nearly seven years, growing from around $3.6 billion in 2008 to around $15 billion at the end of 2015, far outpacing the S&P 500 and other standard benchmarks.

As the crypto craze took off in 2016, though, that fairly linear growth became exponential. The company hit a peak this past August, reaching $175 billion in value, only to slam back down to Earth with today’s $91.2 billion. So in about three full years — even with the last two months’ 50% drop — the company has managed to grow its market value roughly six times. That’s very strong growth for an established company, even in the technology sector.

The key question though is whether today’s market value is backed by the company’s positioning in the marketplace.

As much as Nvidia has blamed the collapse of crypto prices for its challenging position, that is hardly the whole story. New competition from startups and its own customers is challenging the company on its plan to dominate a series of new workload applications like machine learning and autonomous vehicles.

If Nvidia succeeds, its market cap makes a whole lot of sense. But if it fails to keep a market dominant position in these new applications, then it will have to revert back to its core gamer audience, and today’s market cap makes no sense given the limited size and growth of that market.

China / Nvidia

China remains a major site for manufacturing and assembly. STR/AFP/Getty Images

Another strain of readers asked for more analysis around China tariffs and their potential effect on Nvidia (you short sellers are a fascinating lot).

Let’s be clear on my position: I expect the trade conflict to get worse, not better. There is not a single issue for Trump that has better optics, political positioning, and broad support than improving the status quo around China trade. There is broad bipartisan agreement that the status quo is untenable, and while folks might disagree about specific approaches or tactics, no one thinks that China has played fair in trade for years. Trump can look like a fighter for the American worker while bringing (some) Democrats and most of his entire party on board. It’s a potent issue.

That places Nvidia in a real bind, because China is a critical end market for its products, and its manufacturing is heavily intertwined with Chinese supply chains. As an example, just a few months ago Nvidia chose TSMC over Samsung in a bidding competition to produce its large GPUs.

As Arman and I have talked with supply chain folks about tariffs, the general consensus is that low tariffs won’t have much impact, but higher tariffs will force huge changes in the way supply chains are built to counteract those costs. That seems to be the conclusion of Debby Wu at Bloomberg as well, looking at the iPhone supply chain.

That said, as much as I think there should be caution on this front, Nvidia is in a relatively enviable position. Its contract manufacturers will have to deal with the tariffs directly, but Nvidia can move its manufacturing wherever it needs to go — Korea, Vietnam, or back to the U.S. There is of course some time lag, but I would be much more worried about TSMC’s position long-term than Nvidia’s.

Quick Bites

Short summaries and analysis of important news stories, outside our main analysis

SBI Says It Made An Error Allocating Shares in SoftBank IPO – one of the underwriters for SoftBank’s IPO accidentally sent lower share numbers to some buyers, leading to speculation that the company was dealing with a mass selloff. Things seem to be righted, and blockchain enthusiasts will once again get to scream “BLOCKCHAIN” at another financial markets screwup.

The North Face – Cory Arcangel does a great job of decomposing the modern EDM “product” and placing it into today’s context — with some nice connections to our discussion above about Nvidia. “EDM is the perfect reflection of 2018. It is intense, adrenaline-fueled, all-night music made by hyper efficient, work-a-holic, laptop bureaucrats.” Talking about Steve Angello and his rapid series of engagements on the EDM circuit: “Instead, he—his literal, physical self—was being shipped around, with minutes to spare, as part of an intricate just-in-time supply chain. Like Apple’s, this supply chain is also exceedingly light—Angello is the only asset required.” Hat tip to Robert Cottrell at The Browser for this one.

Semiconductor equipment sales forecast: $62B in 2018 a new record – More uplift for 2018, if some challenges in 2019 forecasted. “In 2018, South Korea will remain the largest equipment market for the second year in a row. China will rise in the rankings to claim the second spot for the first time, dislodging Taiwan, which will fall to the third position.” It will be interesting to see how tariffs affect these numbers next year.

What’s next

More semiconductors probably. And Arman and I are side glancing at Yelp these days. Any thoughts? Email me at danny@techcrunch.com.

This newsletter is written with the assistance of Arman Tabatabai from New York

Nvidia is a company that has reached the highest highs and the lowest lows, all in the span of a couple of weeks.

TechCrunch is experimenting with new content forms. This is a rough draft of something new – provide your feedback directly to the author (Danny at danny@techcrunch.com) if you like or hate something here.

Over the past two months, Nvidia’s stock has dropped from a closing price of $289.36 on Oct. 1 to today’s opening of $148.42, a decline of roughly 48.7%.
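For readers who want to sanity-check that figure, here is a quick back-of-the-envelope sketch using only the two prices quoted above (the variable names are mine):

```python
# Back-of-the-envelope check of the drawdown cited above,
# using only the two prices quoted in this newsletter.
close_oct_1 = 289.36   # closing price on Oct. 1
open_today = 148.42    # today's opening price

decline = (close_oct_1 - open_today) / close_oct_1
print(f"Decline: {decline:.1%}")  # Decline: 48.7%
```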

It takes a lot for a company to lose nearly half its value in such a short period of time, but Nvidia is proving that an otherwise strong technology business can disappear in the blink of an eye. The company faces an almost perfect barrage of headwinds to its core products that is stalling its plans for long-term chip domination.

To step back a bit first, though: Nvidia has traditionally made graphics processing units (GPUs) that are excellent at the kinds of parallel computation required for gaming and applications like computer-assisted design (CAD). It’s a durable and repeatable business, and one in which Nvidia has a commanding market share.

Yet these markets are also fairly narrow, so Nvidia has endeavored over the past few years to expand its product offerings to encompass new applications like artificial intelligence/machine learning, autonomous vehicles, and crypto hashing. These applications all need strong parallelized processing, which Nvidia specializes in.

At least part of that story has worked well. Nvidia’s chips were extremely popular in the crypto run-up over the past few years, causing widespread shortages of the chips (and annoying its core gaming fans in the process).

This was huge for Nvidia. The company had revenues of $1.05 billion for the quarter ending Oct. 31, 2013, and $1.31 billion two years later in 2015 — a fairly slow rate of growth, as would be expected for a dominant player in a mature market. As the company expanded its horizons, though, Nvidia gorged on growth in new applications like crypto, growing to $3.2 billion in revenue in its last reported quarter. As could be expected, the stock soared.

Now, Nvidia’s growth story is being hammered on multiple fronts. First and foremost, the huge sales of its chips into the crypto space have dried up as crypto prices have crashed in recent months. This is a pattern we are seeing with other companies, namely Bitmain, which has made specialized crypto chips a major part of its business but has lost an enormous amount of its momentum in the crypto bust. It announced it was shuttering its Israel office this week.

That bust is obvious in Nvidia’s revenues this year: they are essentially flat for three quarters now, hovering between $3.1 and $3.2 billion. Some have called this Nvidia’s “crypto hangover.” But crypto is just one facet of the challenges that Nvidia faces.

When it comes to owning next-generation application workflows, Nvidia is facing robust competition from startups and established players who want access to this potentially gigantic market. Even its potential customers are competing with it. Facebook is reportedly designing its own chips, Apple has been doing so for years, Google has been in the game a while, and Amazon is moving in fast. Nvidia has the know-how to compete, but these companies also understand the nuances of their applications really, really well. It’s a tough market position to be in.

If the challenges around applications weren’t enough, geopolitical tensions are also causing Nvidia serious harm. As Dan Strumpf and Wenxin Fan wrote in the Wall Street Journal two weeks ago in a deep dive, the company is emblematic of the challenge Silicon Valley firms face in the US / China trade standoff:

Nvidia executives are watching the trade fight with growing unease over whether it will curb its access to Chinese customers, according to a person familiar with the matter. Almost 20% of Nvidia’s $9.7 billion in revenue last year came from China. Many of its chips are used there for assembly into other products, and it has invested heavily to tap China’s burgeoning AI industries.

The company also is concerned that deteriorating relations between the world’s two biggest economies are causing Beijing to double down on efforts to reduce reliance on U.S. suppliers of key hardware such as chips by nurturing homegrown competitors, eating into Nvidia’s long-term business.

Crypto, customers, and China. That’s how you lose half your company’s value in two months.

Quick Bites

Hạ Long Bay, Vietnam. Photo by Andrea Schaffer via Flickr used under Creative Commons.

Google ‘studying steps’ to open headquarters in Vietnam in accordance with cybersecurity laws – Following the testimony yesterday from Sundar Pichai on Capitol Hill, it’s interesting to see Google reportedly attempting to open this office in Vietnam, where it faces many of the same challenges as its expansion into China. Vietnam, like many other nations around the world, has recently passed a data sovereignty law that requires that local data be stored locally, forcing Google’s hand. China may be the bogeyman du jour, but the market access challenges posed by China are hardly unique.

Japan’s top 3 telcos to exclude Huawei, ZTE network equipment, according to Japanese news reports – Huawei’s bad news continues, this time with Japanese telcos supposedly vowing not to use the company’s equipment. This is something of a major development if it pans out — so far, the blocks on Huawei equipment have originated from the group of five nations known as the Five Eyes, who share intelligence information. Japan is not a member of that network, and could set the tone for other nations in Asia.

Baidu among 80-plus companies found faking corporate information – Baidu was censured for erroneous information in its Chinese corporate filings. That’s bad news for Baidu, which has hit rock bottom in its share price in the past few days, declining from a 52-week high of $284.22 to today’s opening of $180.50.

What’s next

Arman and I are still investigating the next-generation silicon space. Some good conversations the past few days with investors and supply-chain folks to learn more about this space. Nvidia’s analysis above is the tip of the iceberg. Have thoughts? Give me a ring: danny@techcrunch.com.

This newsletter is written with the assistance of Arman Tabatabai from New York

As Roborace accelerates its plans to build an autonomous racing league, the company is finding that its toughest competition is still human drivers.

In this version of the John Henry story, the humans clearly are still winning, but the robots are catching up.

“We’re going to call it a singularity event when an autonomous racing car is faster than any racing driver,” says Lucas Di Grassi, Roborace’s chief executive and one of the world’s best Formula E racecar drivers. “We started the year 20% slower and we are now 6% slower.”

For the company’s long-term vision, the cars need to be better than any human, because part of the company’s pitch is to be the proving ground for autonomous technologies and a platform to put automakers’ best innovations through their paces in extreme conditions.

“We think when the car reaches a level that is better than any human this will create a layer of trust on the roads,” says Di Grassi. 

It’s a vision that has attracted the attention of some of the world’s biggest companies. Earlier this week, Amazon announced its own initiative for autonomous racing cars. And if Amazon is interested, you can be sure other large technology companies are also angling for a pole position in this proving ground for technology’s latest moonshot.

Amazon’s version of the autonomous race car is smaller than Roborace’s full-sized vehicles — and at $399, far cheaper than the $1 million vehicles that Roborace is planning on putting on tracks.

Beyond the potential corporate competitors, the company’s human competition is more than just a technical obstacle for Roborace. It’s also a critical unknown when it comes to predicting whether anyone actually will want to watch the races.

When asked whether he thinks Roborace can find an audience for races that are divorced from any element of human risk or drama, Di Grassi says, “We don’t know.”

To integrate the two worlds of robot racing and human Formula One (or the increasingly popular Formula E series), Roborace has tweaked its competitive model. Earlier this year, the company unveiled a new model of its car that has room for a human driver behind the wheel.

Robocar

Roborace car at Disrupt Berlin 2018

That human driver is critical to Di Grassi’s new vision for how Roborace competition will now work. In the latest iteration of the company’s races, which will see their first flag waved in April or May of 2019, human drivers will play a larger role in the race.

“We are trying to combine humans and computers in a sport,” says Di Grassi. “The races next year will be a combination of drivers racing for the first part of the race and in a pit stop the driver jumps out and the autonomous vehicle will take over. We want to create this reality that the human and the machine are working together for a better outcome.”

Di Grassi hopes that this integration of the human element and autonomy will be enough to attract viewers, but there are other ways that the company plans to bring an audience to the wild world of autonomous robot racing.

“People want to interact,” says Di Grassi. And with the company’s planned robot races, there will be ways for audiences in the stands to shape the course of the race, potentially by throwing augmented reality obstacles onto the track for the autonomous cars to avoid — creating new challenges for technology to be put through its paces.

“We’re going to try and engage and we’re going to try and get different forms of engagement,” Di Grassi says. That includes developing an open-source platform that would enable viewers to interact with simulated races in virtual reality — encouraging audience participation and competition in virtual racing leagues that could mirror the action among actual racing teams.

Like traditional Formula One racing, Roborace is serving two audiences. One is the company’s actual customers — the automakers and vendors that are building the software and hardware for electric and autonomous vehicles — and the audience that ideally will be around to see the fruit of all that labor.

Right now, no automakers have signed up as partners, in part, Di Grassi says, because they’re not confident in their technology. “The automakers are afraid because the software is not ready,” says Di Grassi. But the company’s chief executive is undeterred, because of the profusion of technologies required to make autonomous vehicles work. “Autonomous cars are a combination of a lot of different technology segments — sensors, electric motors, batteries. Our customers are sensor processing companies [and] companies like Nvidia, Qualcomm, Intel,” Di Grassi says.

However, at some point Roborace needs that audience so vendors can prove that their technology works, and people can become more comfortable with the safety and capabilities of autonomous vehicles.

“Nobody’s using high precision vehicle model like drifting and sliding and these situations will be very real. There is a whole different segment that we can develop faster in a controlled environment,” says Di Grassi. “The pitch is to compete against each other to develop technology faster and you develop trust among consumers… this will give trust to people to jump into autonomous taxi in the future.”

Shield TV, Nvidia’s streaming media set-top box introduced last year, is going hands-free by way of Alexa. The company announced this morning the launch of a new Amazon Alexa skill that will enable Shield TV owners in the U.S. to navigate their device with voice commands, including those to turn the Shield on or off, adjust the volume, play, pause, fast-forward or rewind content, or even move around to various sections, like the Home screen or Settings.

With the addition, Shield TV owners can now choose between Alexa and Google Assistant devices for hands-free voice control.

To use Alexa voice commands, you’ll first have to pair an Echo with the Shield from within the Alexa app. Then, you’ll need to enable the new “Nvidia Shield TV” skill from Amazon’s skill store. After accepting the terms, you’ll link the skill to your Nvidia account. The last step is to select the Shield and the Echo devices you’d like to connect to work together.

Once set up, you can say things like “Alexa, turn on Shield,” “Alexa, play (on Shield),” “Alexa, fast-forward 10 minutes (on Shield),” “Alexa go to control settings (on Shield),” and more.

(Following the first command, Alexa will assume subsequent relevant commands are for Shield. Where “on Shield” is noted in parentheses, you can continue controlling the device with just the command.)

The company says the Alexa support will come to other non-U.S. regions over time, and the list of supported commands will grow, as well.

Alongside the launch, Nvidia also announced Dolby Atmos passthrough support for Prime Video, and a promotion that will give customers buying a Shield TV from Amazon or Best Buy a free Echo Dot (3rd Gen). Current owners can buy a discounted Echo Dot.

Alexa already interoperates with a number of set-top boxes, including Amazon’s own Fire TV line (where the Cube recently added in-app navigation, too); plus TiVo’s devices, Dish’s Hopper, Xbox, DirecTV’s set-top box, and others.

Nvidia, together with partners like IBM, HPE, Oracle, Databricks and others, is launching a new open-source platform for data science and machine learning today. Rapids, as the company is calling it, is all about making it easier for large businesses to use the power of GPUs to quickly analyze massive amounts of data and then use that to build machine learning models.

“Businesses are increasingly data-driven,” Nvidia’s VP of Accelerated Computing Ian Buck told me. “They sense the market and the environment and the behavior and operations of their business through the data they’ve collected. We’ve just come through a decade of big data and the output of that data is using analytics and AI. But most of it is still using traditional machine learning to recognize complex patterns, detect changes and make predictions that directly impact their bottom line.”

The idea behind Rapids then is to work with the existing popular open-source libraries and platforms that data scientists use today and accelerate them using GPUs. Rapids integrates with these libraries to provide accelerated analytics, machine learning and — in the future — visualization.

Rapids is based on Python, Buck noted; it has interfaces that are similar to Pandas and Scikit, two very popular machine learning and data analysis libraries, and it’s based on Apache Arrow for in-memory database processing. It can scale from a single GPU to multiple nodes, and IBM notes that the platform can achieve improvements of up to 50x for some specific use cases when compared to running the same algorithms on CPUs (though that’s not all that surprising, given what we’ve seen from other GPU-accelerated workloads in the past).
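To make that concrete, here is a minimal sketch of what that pandas/scikit-style, GPU-backed workflow could look like. It assumes the RAPIDS cuDF and cuML Python packages (the announcement doesn’t name individual libraries) running on a CUDA-capable machine; the CSV file and column names are hypothetical:

```python
# Minimal sketch of a GPU-accelerated data science workflow in the
# pandas/scikit style Rapids describes. Assumes the RAPIDS cuDF and
# cuML packages are installed on a CUDA-capable machine; the file and
# column names below are placeholders.
import cudf
from cuml.cluster import KMeans

# Read a CSV straight into GPU memory with a pandas-like API
gdf = cudf.read_csv("transactions.csv")

# Select features using familiar dataframe syntax
features = gdf[["amount", "frequency"]]

# Fit a scikit-learn-style estimator entirely on the GPU
model = KMeans(n_clusters=8)
model.fit(features)

print(model.cluster_centers_)
```

The point is less the specific estimator than the fact that the code reads like ordinary Pandas and scikit-learn while the heavy lifting happens on the GPU.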

Buck noted that Rapids is the result of a multi-year effort to develop a rich enough set of libraries and algorithms, get them running well on GPUs and build the relationships with the open-source projects involved.

“It’s designed to accelerate data science end-to-end,” Buck explained. “From the data prep to machine learning and for those who want to take the next step, deep learning. Through Arrow, Spark users can easily move data into the Rapids platform for acceleration.”

Indeed, Spark is surely going to be one of the major use cases here, so it’s no wonder that Databricks, the company founded by the team behind Spark, is one of the early partners.

“We have multiple ongoing projects to integrate Spark better with native accelerators, including Apache Arrow support and GPU scheduling with Project Hydrogen,” said Spark founder Matei Zaharia in today’s announcement. “We believe that RAPIDS is an exciting new opportunity to scale our customers’ data science and AI workloads.”

Nvidia is also working with Anaconda, BlazingDB, PyData, Quansight and scikit-learn, as well as Wes McKinney, the head of Ursa Labs and the creator of Apache Arrow and Pandas.

Another partner is IBM, which plans to bring Rapids support to many of its services and platforms, including its PowerAI tools for running data science and AI workloads on GPU-accelerated Power9 servers, IBM Watson Studio and Watson Machine Learning, and the IBM Cloud with its GPU-enabled machines. “At IBM, we’re very interested in anything that enables higher performance, better business outcomes for data science and machine learning — and we think Nvidia has something very unique here,” Rob Thomas, the GM of IBM Analytics, told me.

“The main benefit to the community is that through an entirely free and open-source set of libraries that are directly compatible with the existing algorithms and subroutines that they’re used to — they now get access to GPU-accelerated versions of them,” Buck said. He also stressed that Rapids isn’t trying to compete with existing machine learning solutions. “Part of the reason why Rapids is open source is so that you can easily incorporate those machine learning subroutines into their software and get the benefits of it.”