Skyline AI raises $3M from Sequoia Capital to help real estate investors make better decisions

Skyline AI, an Israeli startup that uses machine learning to help real estate investors identify promising properties, announced today that it has raised $3 million in seed funding from Sequoia Capital. The round will be used to build its tech platform and hire experts in data science and machine learning.

Founded in 2017 and headquartered in Tel Aviv, Skyline AI predicts future property values and also analyzes the real estate market to help investors make important decisions such as when to raise rents, renovate or sell. Co-founder and chief executive officer Guy Zipori told TechCrunch that Skyline AI’s founding team (which also includes chief technology officer Or Hiltch, chief revenue officer Iri Amirav and executive chairman Amir Leitersdorf) worked together for years at various artificial intelligence-based startups in sectors including security, healthcare and online video. After several of their companies exited, the four were in a position to look for new investment opportunities. They wanted to explore commercial real estate but, Zipori said, they “were surprised by how limited the technology is in this space.”

Though more industries are turning to data science and artificial intelligence to save time while making complex decisions, many veteran investors still depend on Excel spreadsheets, outdated market data and their “gut feelings,” added Zipori.

Skyline AI wants to take the guesswork out of investment decisions by training its technology on what it claims is the most comprehensive dataset in the industry, drawing on more than 130 sources and analyzing over 10,000 attributes for each asset, going back 50 years. Skyline AI’s tech then compiles all of that information into a data lake and cross-references it to find discrepancies and determine which figures are the most accurate.
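To make the cross-referencing idea concrete, here is a minimal Python sketch of how records for the same property, pulled from different sources, might be compared to flag disagreements. The field names and sources are hypothetical; Skyline AI has not published the details of its pipeline.

```python
from collections import defaultdict

# Hypothetical records for one property, pulled from three different data sources.
records = [
    {"source": "county_assessor", "year_built": 1987, "units": 120, "last_sale_price": 14_500_000},
    {"source": "listing_service", "year_built": 1987, "units": 118, "last_sale_price": 14_500_000},
    {"source": "data_vendor_x",   "year_built": 1989, "units": 120, "last_sale_price": 15_200_000},
]

def find_discrepancies(records):
    """Group values by attribute and report any attribute where sources disagree."""
    values = defaultdict(set)
    for rec in records:
        for attr, val in rec.items():
            if attr != "source":
                values[attr].add(val)
    return {attr: vals for attr, vals in values.items() if len(vals) > 1}

print(find_discrepancies(records))
# e.g. {'year_built': {1987, 1989}, 'units': {118, 120}, 'last_sale_price': {14500000, 15200000}}
```

In a real pipeline the disagreeing attributes would then be resolved against whichever source has proven most reliable, which is the kind of "figure out what information is the most accurate" step Zipori describes.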

“As a side note, we were surprised to learn that asset data sampled from different sources is often dissimilar, meaning that in some cases decisions regarding large deals were made based on bad data,” Zipori said.

One benefit of Skyline AI’s system is that it can consider variables that would be difficult to include in Excel spreadsheets and other traditional methods for aggregating data. That matters in real estate, where so many factors can affect a property’s value, rents, occupancy levels, maintenance costs and future worth.

In a statement, Sequoia Capital partner Haim Sadger said “The promise of AI to transform commercial real estate investments cannot be understated. Over the last few years, we’ve seen AI disrupt a number of traditional industries and the real estate market should be no different. The power of Skyline AI technology to understand vast amounts of data that affect real estate transactions, will unlock billions of dollars in untapped value.”

Pandora doubles down on ad tech with acquisition of AdsWizz for $145 million

Pandora announced this morning that it’s acquiring digital audio ad technology firm AdsWizz for $145 million, paid as at least 50 percent cash, with the remainder paid in either cash or stock at Pandora’s discretion. AdsWizz, whose technology will be used to upgrade Pandora’s own ad tech capabilities, will continue as a subsidiary headed by CEO Alexis van de Wyer.

AdsWizz offers an end-to-end technology platform that powers music platforms, podcasts and broadcasting groups. Customers include Cox Media Group, iHeartRadio, TuneIn, Entercom, Omnicom Media Group, Spotify, Deezer, PodcastOne, GroupM, and others.

Its software suite includes a variety of ad technology capabilities, including dynamic ad insertion, advanced programmatic platforms, ad campaign monitoring tools, and more.

[Image: AdsWizz’ AudioMatic platform for programmatic buying]

It has also developed a number of new audio ad formats: ShakeMe, which has users shake their phones to trigger an action while listening to an ad; ads that can target users based on an activity like jogging or a situation like cold weather; ads that can be customized based on personalized data; and others.

Pandora says it will leverage the acquisition to capitalize on the growth in digital audio advertising, which is up 42 percent year-over-year, according to the IAB.

We understand that Pandora was interested in how AdsWizz’ roadmap aligned with its own, specifically in the areas of audio monetization and ad-buying capabilities. The company believes that by joining forces, it will be able to launch new products faster.

Once integrated with Pandora, advertisers will be able to transact through AdsWizz’s global marketplace across Pandora and other audio publishers, the company says. Pandora says that it will continue to invest in AdsWizz technology that supports its core business and the wider industry — or, in other words, Pandora isn’t ripping AdsWizz away from its competitors in streaming at this time.

“Since I joined Pandora six months ago, I have highlighted ad tech as a key area of investment for us. Today we took an important step to advance that priority and accelerate our product roadmap,“ said Roger Lynch, CEO of Pandora, in a statement. “With our scale in audio advertising and AdsWizz’s tech expertise, we will create the largest digital audio advertising ecosystem, better serving global publishers and advertisers — while improving Pandora’s own monetization capabilities.”

The deal comes at a time when Pandora continues to generate the majority of its revenue from advertising, despite its entry into the subscription business with its own rival to Apple Music and Spotify. In its Q4 2017 earnings, the company reported $97.7 million in subscription revenue, which helped offset a 5 percent year-over-year decline in advertising revenue; ad revenue clocked in at $297.7 million, down from $313.3 million in the same quarter the year before.

It also follows a rough year for Pandora, which saw founder and CEO Tim Westergren exit, along with Chief Marketing Officer Nick Bartle and President Mike Herring, as part of a larger executive shakeup.

In addition to AdsWizz CEO van de Wyer, Pandora’s acquisition will add 140 people to the company from across all AdsWizz locations, including its San Mateo headquarters.

Pandora has not been as acquisitive as rival Spotify, having generally purchased businesses that are of strategic importance at the time, instead of smaller teams with interesting technology. It’s best known for its acquisitions of Rdio and Ticketfly in 2015, though the latter was handed off to Eventbrite last year.

“We know that the value we have created for all our stakeholders – brands, publishers and listeners – comes from our ability to create engaging and well-targeted advertising experiences. That will not change,” said van de Wyer, in a post on AdsWizz’ website. “Our focus has always been digital audio, with a unique expertise in innovative monetization solutions. That will not change. What will change is our ability to grow even faster, to develop technology more rapidly, to accelerate our ability to provide solutions that meet the increasingly sophisticated needs of advertisers and digital audiences. And to have an even bigger impact on people’s lives,” he added.

The new acquisition does not impact the first quarter 2018 guidance or the full year 2018 commentary provided at Pandora’s last earnings, the company notes. The transaction is expected to close in the second quarter of 2018, and is subject to regulatory approval.

Mythic nets $40M to create a new breed of efficient AI-focused hardware

Another huge financing round is coming in for an AI company today, this time for a startup called Mythic, which is picking up a fresh $40 million as massive deals close left and right across the sector.

Mythic focuses specifically on the inference side of AI operations — basically making a prediction on the spot based on an extensively trained model. Its chips are designed to be small and low power while delivering the kind of performance you’d expect from a GPU for the lightning-fast calculations algorithms need to perform, such as figuring out whether the thing your car is about to run into is a cat or just some text on the road. SoftBank Ventures led this most recent round of funding, with a strategic investment also coming from Lockheed Martin Ventures. ARM executive Rene Haas will also be joining the company’s board of directors.

Mythic, like other startups, is looking to ease the back-and-forth trips to memory that slow processors down and drive up power consumption, and CEO Michael Henry says the company has figured out how to essentially do the operations — based in a field of mathematics called linear algebra — on flash memory itself.

“The key to getting really high performance and really good energy efficiency is to keep everything on the chip,” Henry said. “The minute you have to go outside the chip to memory, you lose all performance and energy. It just goes out the window. Knowing that, we found that you can actually leverage flash memory in a very special way. The limit there is, it’s for inference only, but we’re only going after the inference market — it’s gonna be huge. On top of that, the challenge is getting the processors and memory as close together as possible so you don’t have to move around the data on the chip.”

Mythic’s approach is designed to be what Henry calls more analog. To visualize how it might work, imagine a set-up in Minecraft, with a number of different strings of blocks leading to an end gate. If you flipped a switch to turn 50 of those strings on with some unit value, leaving the rest off, and joined them at the end and saw the combined final result of the power, you would have completed something similar to an addition operation leading to a sum of 50 units. Mythic’s chips are designed to do something not so dissimilar, finding ways to complete those kinds of analog operations for addition and multiplication in order to handle the computational requirements for an inference operation. The end result, Henry says, consumes less power and dissipates less heat while still getting just enough accuracy to get the right solution (more technically: the calculations are 8-bit results).
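For intuition only, here is a small Python sketch of that idea — summing many switched-on unit contributions, as in the Minecraft analogy, and then quantizing the noisy result to 8 bits, the precision Henry mentions. This is an illustrative toy, not Mythic’s actual (unpublished) circuit design.

```python
import random

def analog_style_sum(weights, inputs, noise_std=0.01):
    """Accumulate weighted contributions like summed currents, with a little analog noise."""
    total = sum(w * x for w, x in zip(weights, inputs))
    total += random.gauss(0.0, noise_std)  # analog circuits are inherently a bit noisy
    return total

def quantize_8bit(value, scale):
    """Map the analog result onto one of 256 levels (a signed 8-bit result)."""
    level = round(value / scale)
    return max(-128, min(127, level))

weights = [1.0] * 50 + [0.0] * 50   # 50 "strings of blocks" switched on, 50 switched off
inputs = [1.0] * 100                # unit inputs on every string

raw = analog_style_sum(weights, inputs)
print(raw, quantize_8bit(raw, scale=1.0))   # roughly 50.0, quantized to 50
```

The point of the toy is that the answer only needs to be accurate enough to land on the correct 8-bit level, which is why a noisy analog sum can still produce the right result.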

After that, the challenge is sticking a layer on top of the hardware so that it looks and behaves like a normal chip to a developer. The goal, as with other players in the AI hardware space, is to plug straight into frameworks like TensorFlow. Those frameworks abstract away all the complicated tooling and tuning required for such a specific piece of hardware and make it approachable for developers to start building machine learning projects. Andrew Feldman, CEO of another AI hardware startup called Cerebras Systems, said at the Goldman Sachs Technology and Internet Conference last month that frameworks like TensorFlow had eliminated most of the value Nvidia had built by creating a developer ecosystem around its own hardware.

Henry, too, is a big TensorFlow fan, and for good reason: it’s frameworks like TensorFlow that allow next-generation chip ideas to get off the ground in the first place. These frameworks, which have become increasingly popular with developers, abstract away the complexity of working with specific low-level hardware like a field-programmable gate array (FPGA) or a GPU. That has made building machine learning-based applications much easier for developers and led to an explosion of activity in machine learning, whether for speech or image recognition or any number of other use cases.

“Things like TensorFlow make our lives so much easier,” Henry said. “Once you have a neural network described on TensorFlow, it’s on us to take that and translate that onto our chip. We can abstract that difficulty by having an automatic compiler.”
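As an illustration of what “a neural network described on TensorFlow” looks like from the developer’s side, here is a minimal Keras model definition in Python. In the workflow Henry describes, a vendor-supplied compiler (hypothetical here — Mythic hasn’t published its toolchain) would take a trained graph like this and map it onto the accelerator, so the developer never touches the chip directly.

```python
import tensorflow as tf

# A small image classifier described entirely at the framework level.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# In the flow Henry describes, a chip vendor's automatic compiler (not shown here)
# would consume a saved, trained model like this and emit a binary for the accelerator.
model.save("classifier.h5")  # the saved graph is what such a compiler would ingest
```

The developer’s job ends at the framework level; translating that graph onto the chip is, as Henry puts it, “on us.”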

While many of these companies are talking about getting massive performance gains over a GPU — and, to be sure, Henry hopes that’ll be the case — the near-term goal for Mythic is to match the performance of a $1,000 GPU while taking up less space and consuming less power. There’s an immediate market for a card customers can hot-swap into existing hardware; Henry says the company is focused on a PCI-E interface, a very common plug-and-play standard, and that’s it.

The challenge for Mythic, however, is going to be getting into the actual design of the hardware that ships. It’s one thing to sell a bunch of cards that companies can stick into their existing hardware, but it’s another to get embedded into the devices themselves — which is what’s going to need to happen if Mythic wants to be a true workhorse for devices on the edge, like security cameras or gadgets handling speech recognition. That makes the buying cycle longer, but at the same time, there will be billions of devices out there that need advanced hardware to power their inference operations.

“If we can sell a PCI card, you buy it and drop it in right away, but those are usually for low-volume, high-selling price products,” Henry said. “The other customers we serve design you into the hardware products. That’s a longer cycle, that can take upwards of a year. For that, typically the volumes are much higher. The nice thing is that you’re really really sticky. If they design you into a product you’re really sticky. We can go after both, we can go after board sales, and then go after design.”

There are probably going to be two big walls for Mythic, not to mention the other players out there. The first is that none of these companies have shipped a product. While Mythic, or its rivals, might have a proof-of-concept chip to drop on the table, getting to production-ready next-generation silicon is a dramatic undertaking. Then there’s the process of not only getting people to buy the hardware, but actually convincing them the systems will be in place to ensure that developers build on it. Mythic says it plans to have samples for customers by the end of the year, with a production product by 2019.

That also explains why Mythic, along with those other startups, is able to raise enormous rounds of money — which means there’s going to be a lot of competition among all of them. Here’s a quick list of the fundraising so far: SambaNova Systems raised $56 million last week; Graphcore raised $50 million in November last year; Cerebras Systems’ first round was $25 million in December 2016; and this isn’t even counting the increasing amount of activity among companies in China. There’s still a segment of investors that considers the space way too hot (and there is, indeed, a ton of funding) or potentially unnecessary if you don’t need the bleeding-edge efficiency or power of these products.

And there are, of course, the elephants in the room in the form of Nvidia and, to a lesser extent, Intel. The latter is betting big on FPGAs and other products, while Nvidia has snapped up most of the market thanks to GPUs being much more efficient at the kind of math needed for AI. The play for all these startups is that they can be faster, more efficient or, in Mythic’s case, cheaper than those options. It remains to be seen whether any of them will unseat Nvidia, but nonetheless there’s an enormous amount of funding flowing in.

“The question is, is someone going to be able to beat Nvidia when they have the valuation and cash reserves,” Henry said. “But the thing, is we’re in a different market. We’re going after the edge, we’re going after things embedded inside phones and cars and drones and robotics, for applications like AR and VR, and it’s just really a different market. When investors analyze us they have to think of us differently. They don’t think, is this the one that wins Nvidia, they think, are one or more of these powder keg markets explode. It’s a different conversation for us because we’re an edge company.”

Alibaba doubles down on Lazada with fresh $2B investment and new CEO

Alibaba is increasing its control of Lazada, the Southeast Asian e-commerce marketplace it took control of in 2016, injecting another $2 billion into the business and replacing its CEO with a long-standing Alibaba executive.

Alibaba’s first investment came in April 2016, when it bought 51 percent of Lazada for $1 billion, and it added another $1 billion last summer to increase its equity to around 83 percent. With today’s news, Alibaba has invested $4 billion to date, which it said will “accelerate the growth plans” and help further tie the Lazada business into Alibaba’s core e-commerce service.

There’s already been plenty of evidence of increased ties between Alibaba and Lazada. The latter began offering products from Alibaba’s Taobao marketplace across Southeast Asia last year, and Alibaba has replaced Lazada’s tech team leadership with executives of its own. The latest shakeup is the appointment of Lucy Peng as Lazada’s new CEO to replace Max Bittner, who was installed by former owner Rocket Internet back in 2012.

Peng, one of Alibaba’s original 18 founders, has been chairwoman of Lazada and is also executive chairman of Ant Financial, Alibaba’s fintech affiliate. Bittner will stay on as “senior advisor to Alibaba Group” and will apparently remain involved in future strategy, including further international expansion opportunities.

Lazada has progressed significantly since Alibaba’s first investment — which came at a time when the business had been close to running out of money — but the reality is that e-commerce in Southeast Asia remains a loss-making industry with plenty of competition.

Amazon entered the fray last year but remains only in Singapore, while Shopee is a two-year-old entrant bankrolled by Sea, formerly Garena, which raised over $1 billion in a U.S. IPO last year.

Alibaba hasn’t limited its Southeast Asia approach to backing Lazada. The firm also invested $1.1 billion in Tokopedia, which competes with Lazada in Indonesia, Southeast Asia’s largest economy and the world’s fourth most populous country.

Qualcomm’s former exec chair will exit after exploring an acquisition bid

There’s a new twist in the BroadQualm saga this afternoon: Qualcomm said it won’t renominate Paul Jacobs, the company’s former executive chairman, after he notified the board that he had decided to explore making a proposal to acquire Qualcomm.

The last time we saw a take-private exploration of this magnitude was circa 2013, when Dell initiated a leveraged buyout to go private in a deal worth $24.4 billion. This would be dramatically larger in scale, and the Financial Times reports that Jacobs has approached SoftBank as a potential partner in a buyout. Jacobs, the son of Qualcomm founder Irwin Jacobs, rose to run the company himself as CEO from 2005 to 2014. Successfully completing a buyout of this scale would keep the company his father founded in 1985 in the family.

“I am glad the board is willing to evaluate such a proposal, consistent with its fiduciary duties to shareholders,” Jacobs said in a statement. “It is unfortunate and disappointing they are attempting to remove me from the board at this time.”

All this comes after Broadcom dropped its plan to complete a hostile takeover of Qualcomm, a deal that would have consolidated two of the world’s largest semiconductor companies into a single unit. Qualcomm said its board of directors will instead consist of just 10 members.

“Following the withdrawal of Broadcom’s takeover proposal, Qualcomm is focused on executing its business plan and maximizing value for shareholders as an independent company,” the company said in a statement. “There can be no assurance that Dr. Jacobs can or will make a proposal, but, if he does, the Board will of course evaluate it consistent with its fiduciary duties to shareholders.”

Broadcom dropped its attempt after the Trump administration decided to block the deal altogether. The BroadQualm deal fell into purgatory following an investigation by the Committee on Foreign Investment in the United States, or CFIUS, which eventually led to the administration putting a stop to the deal — and potentially any deal of that scale — while Broadcom was still based in Singapore. Broadcom had intended to redomicile in the United States, but the timing was such that Qualcomm ended up avoiding Broadcom’s attempt at a hostile takeover.

BroadQualm was filled with twists and turns, coming to a chaotic head this week with the end of the deal. Qualcomm removed Jacobs from his role as executive chairman and installed an independent director, then delayed the shareholder meeting that would have given Broadcom an opportunity to pick up the votes to take control of part of Qualcomm’s board of directors. The administration then handed down its judgment, and Qualcomm moved its shareholder meeting up, to ten days after the decision.

“There are real opportunities to accelerate Qualcomm’s innovation success and strengthen its position in the global marketplace,” Jacobs said in the statement. “These opportunities are challenging as a standalone public company, and there are clear merits to exploring a path to take the company private in order to maximize the company’s long-term performance, deliver superior value to all stockholders, and bolster a critical contributor to American technology.”

It’s not clear whether Jacobs can piece together the partnerships necessary to complete a buyout of this scale. But it’s easy to read between the lines of Qualcomm’s statement — which, as always, has to say the board will fulfill its fiduciary duty to shareholders. The former CEO and executive chairman has quietly been a curious figure throughout this whole process, and it looks like the BroadQualm saga is nowhere near done.

Equity podcast: Theranos’s reckoning, BroadQualm’s stunning conclusion and Lyft’s platform ambitions

Hello and welcome back to Equity, TechCrunch’s venture capital-focused podcast where we unpack the numbers behind the headlines.

This week Katie Roof and I were joined by Mayfield Fund’s Navin Chaddha, an investor with early connections to Lyft, to talk about, well, Lyft — as well as two bombshell news events: an SEC fine for Theranos and Broadcom’s hostile takeover effort for Qualcomm hitting the brakes. Alex Wilhelm was not present this week but will join us again soon (we assume he was tending to his Slayer shirt collection).

Starting off with Lyft: there was quite a bit of activity for Uber’s biggest competitor in North America. The ride-sharing startup (can we still call it a startup?) said it would partner with Magna to “co-develop” an autonomous driving system. Chaddha talks a bit about how Lyft’s ambition isn’t to be a vertical business like Uber, but to serve as a platform for anyone to plug into. We’ve definitely seen this play out before — just look at what happened with Apple (the closed platform) and Android (the open platform). We dive in to see whether Lyft’s ambitions will actually pan out as planned. Also, it got $200 million out of the deal.

Next up is Theranos, where the SEC investigation finally came to a head as founder Elizabeth Holmes and former president Ramesh “Sunny” Balwani were formally charged by the agency with fraud. The SEC says the two raised more than $700 million from investors through an “elaborate, years-long fraud in which they exaggerated or made false statements about the company’s technology, business, and financial performance.” You can find the full story by TechCrunch’s Connie Loizos here, and we got a chance to dig into the implications for how investors scope out potential founders going forward. (Hint: Chaddha says they need to be more careful.)

Finally, BroadQualm is over. After months of hand-wringing over whether Broadcom would buy — or mount a hostile takeover of — the U.S. semiconductor giant, the Trump administration blocked the deal. A cascading series of events involving CFIUS, a government body, got things to the point where Broadcom’s aggressive dealmaker Hock Tan dropped his plans to go after Qualcomm altogether. The largest deal in tech history will, indeed, not be happening (for now), and that has potentially big implications for M&A going forward.

That’s all for this week, we’ll catch you guys next week. Happy March Madness, and may fortune favor* your brackets.

Equity drops every Friday at 6:00 am PT, so subscribe to us on Apple Podcasts, Overcast, Pocketcast, Downcast and all the casts.

*Assuming you have Duke losing before the Elite Eight.

The red-hot AI hardware space gets even hotter with $56M for a startup called SambaNova Systems

Another massive financing round for an AI chip company is coming in today, this time for SambaNova Systems — a startup founded by a pair of Stanford professors and a longtime chip company executive — to build out the next generation of hardware to supercharge AI-centric operations.

SambaNova joins an already large class of startups looking to make AI operations much more efficient and faster by rethinking the actual substrate where the computations happen. While the GPU has become increasingly popular among developers for its ability to handle, at very high speed, the kinds of lightweight mathematics necessary for AI operations, startups like SambaNova want to create a new platform from scratch, all the way down to the hardware, that is optimized exactly for those operations. The hope is that by doing so, it will be able to outclass a GPU in terms of speed, power usage and even, potentially, the actual size of the chip. SambaNova today said it has raised a massive $56 million Series A financing round led by GV, with participation from Redline Capital and Atlantic Bridge Ventures.

SambaNova is the product of technology from Kunle Olukotun and Chris Ré, two professors at Stanford, and is led by Rodrigo Liang, a former SVP of development who also spent almost eight years as a VP at Sun. In looking at the landscape, the team at SambaNova worked its way backwards, first identifying which operations need to happen more efficiently and then figuring out what kind of hardware needs to be in place to make that happen. That boils down to a lot of calculations stemming from a field of mathematics called linear algebra, done very, very quickly, which is something existing CPUs aren’t exactly tuned to do. And a common criticism from founders in this space is that Nvidia GPUs, while much more powerful than CPUs for these operations, are still ripe for disruption.
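To ground the “linear algebra done very, very quickly” point: the core of a neural-network layer at inference time is essentially a matrix multiply plus a simple nonlinearity, as in this small NumPy sketch (illustrative only, not SambaNova’s design).

```python
import numpy as np

def dense_layer(x, W, b):
    """One fully connected layer: a matrix multiply, a bias add, and a ReLU."""
    return np.maximum(W @ x + b, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)           # input activations
W = rng.standard_normal((512, 1024))    # layer weights
b = rng.standard_normal(512)

y = dense_layer(x, W, b)
print(y.shape)  # (512,) -- this matmul is the operation AI accelerators race to speed up
```

Stacks of operations like this, repeated millions of times per second, are what the new chips are being built to execute more efficiently than a CPU or GPU.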

“You’ve got these huge [computational] demands, but you have the slowing down of Moore’s law,” Olukotun said. “The question is, how do you meet these demands while Moore’s law slows. Fundamentally you have to develop computing that’s more efficient. If you look at the current approaches to improve these applications based on multiple big cores or many small, or even FPGA or GPU, we fundamentally don’t think you can get to the efficiencies you need. You need an approach that’s different in the algorithms you use and the underlying hardware that’s also required. You need a combination of the two in order to achieve the performance and flexibility levels you need in order to move forward.”

While $56 million might sound massive for a Series A, it’s becoming a pretty standard number for startups attacking this space, which has an opportunity to beat massive chipmakers and create a new generation of hardware that will be omnipresent in any device built around artificial intelligence — whether that’s a chip in an autonomous vehicle doing rapid image processing or a server within a healthcare organization training models for complex medical problems. Graphcore, another chip startup, got $50 million in funding from Sequoia Capital, while Cerebras Systems also received significant funding from Benchmark Capital.

Olukotun and Liang wouldn’t go into the specifics of the architecture, but they are looking to redo the operational hardware to optimize for the AI-centric frameworks that have become increasingly popular in fields like image and speech recognition. At its core, that involves rethinking how interaction with memory occurs and how the hardware dissipates heat, among other complex problems. Apple, Google with its TPU and, reportedly, Amazon have taken an intense interest in this space, designing their own hardware optimized for products like Siri or Alexa, which makes sense: dropping latency as close to zero as possible while maintaining accuracy improves the user experience. A great user experience leads to more lock-in for those platforms, and while the larger players may end up making their own hardware, GV’s Dave Munichiello — who is joining the company’s board — says that is basically validation that everyone else will need the technology soon enough.

“Large companies see a need for specialized hardware and infrastructure,” he said. “AI and large-scale data analytics are so essential to providing services the largest companies provide that they’re willing to invest in their own infrastructure, and that tells us more investment is coming. What Amazon and Google and Microsoft and Apple are doing today will be what the rest of the Fortune 100 are investing in in 5 years. I think it just creates a really interesting market and an opportunity to sell a unique product. It just means the market is really large, if you believe in your company’s technical differentiation, you welcome competition.”

There is certainly going to be a lot of competition in this area, and not just from startups. While SambaNova wants to create a true platform, there are a lot of different interpretations of where it should go — such as whether it should be two separate pieces of hardware that handle inference and training separately. Intel, too, is betting on an array of products, including field-programmable gate arrays (FPGAs), which allow for a more modular approach to building hardware specified for AI and are designed to be flexible and change over time. Munichiello’s and Olukotun’s argument is that FPGAs require developers with special expertise in the technology, a sort of niche-within-a-niche that most organizations will probably not have readily available.

Nvidia has been a massive beneficiary of the explosion in AI systems, but that explosion has clearly exposed a ton of interest in investing in a new breed of silicon. There’s certainly an argument for developer lock-in on Nvidia platforms like CUDA. But a lot of new frameworks, like TensorFlow, are creating a layer of abstraction that is increasingly popular with developers. That, too, represents an opportunity for SambaNova and other startups, which can simply work to plug into those popular frameworks, Olukotun said. Cerebras Systems CEO Andrew Feldman also addressed some of this on stage at the Goldman Sachs Technology and Internet Conference last month.

“Nvidia has spent a long time building an ecosystem around their GPUs, and for the most part, with the combination of TensorFlow, Google has killed most of its value,” Feldman said at the conference. “What TensorFlow does is, it says to researchers and AI professionals, you don’t have to get into the guts of the hardware. You can write at the upper layers and you can write in Python, you can use scripts, you don’t have to worry about what’s happening underneath. Then you can compile it very simply and directly to a CPU, TPU, GPU, to many different hardwares, including ours. If in order to do work you have to be the type of engineer that can do hand-tuned assembly or can live deep in the guts of hardware there will be no adoption… We’ll just take in their TensorFlow, we don’t have to worry about anything else.”

(As an aside, I was once told that CUDA and those other lower-level platforms are really used by AI wonks like Yann LeCun building weird AI stuff in the corners of the Internet.)

There are also two big question marks for SambaNova. First, it’s very new, having started in just November, while many of these efforts, at both startups and larger companies, have been years in the making. Munichiello’s answer is that development of the underlying technology did, indeed, begin a while ago, and that’s not a terrible thing as SambaNova gets started amid the current generation of AI needs. The second, raised by some in the valley, is that most of the industry just might not need hardware that does these operations in a blazing-fast manner. That concern, you might argue, could be countered by the fact that so many of these companies are getting so much funding, with some already approaching billion-dollar valuations.

But, in the end, you can now add SambaNova to the list of AI startups that have raised enormous rounds of funding — a list that stretches to include companies around the world like Graphcore and Cerebras Systems, as well as a lot of reported activity out of China from companies like Cambricon Technology and Horizon Robotics. This effort does, indeed, require significant investment, not only because it’s hardware at its base, but because SambaNova has to convince customers to deploy that hardware and start tapping the platforms it creates, and supporting existing frameworks hopefully makes that easier.

“The challenge you see is that the industry, over the last ten years, has underinvested in semiconductor design,” Liang said. “If you look at the innovations at the startup level all the way through big companies, we really haven’t pushed the envelope on semiconductor design. It was very expensive and the returns were not quite as good. Here we are, suddenly you have a need for semiconductor design, and to do low-power design requires a different skillset. If you look at this transition to intelligent software, it’s one of the biggest transitions we’ve seen in this industry in a long time. You’re not accelerating old software, you want to create that platform that’s flexible enough [to optimize these operations] — and you want to think about all the pieces. It’s not just about machine learning.”