
Google’s ‘quantum supremacy’ usurped by researchers using ordinary supercomputer – TechCrunch

Back in 2019, Google proudly announced that it had achieved what quantum computing researchers had sought for years: proof that the esoteric technique could outperform traditional ones. But this demonstration of “quantum supremacy” is being challenged by researchers claiming to have pulled ahead of Google on a relatively normal supercomputer.

To be clear, no one is saying Google lied or misrepresented its work — the painstaking and groundbreaking research that led to the quantum supremacy announcement in 2019 is still hugely important. But if this new paper is correct, the classical vs. quantum computing competition is still anybody’s game.

You can read the full story of how Google took quantum from theory to reality in the original article, but here’s the very short version. Quantum computers like Sycamore are not better than classical computers at anything yet, with the possible exception of one task: simulating a quantum computer.

It sounds like a cop-out, but the point of quantum supremacy is to show the method’s viability by finding even one highly specific and weird task that it can do better than even the fastest supercomputer, because that gets the quantum foot in the door to expand the library of such tasks. Perhaps in the end every task will be faster on quantum hardware, but for Google’s purposes in 2019, only one was, and the company showed how and why in great detail.

Now, a team at the Chinese Academy of Sciences led by Pan Zhang has published a paper describing a new technique for simulating a quantum computer (specifically, certain noise patterns it puts out) that appears to take a tiny fraction of the time estimated for classical computation to do so in 2019.

Not being a quantum computing expert or a statistical physics professor myself, I can only give a general idea of the technique Zhang et al. used. They cast the problem as a large 3D network of tensors, with the 53 qubits in Sycamore represented by a grid of nodes, extruded out 20 times to represent the 20 cycles the Sycamore gates went through in the simulated process. The mathematical relationships between these tensors (each its own set of interrelated vectors) were then calculated using a cluster of 512 GPUs.

An illustration from Zhang’s paper showing a visual representation of the 3D tensor array they used to simulate Sycamore’s quantum operations.
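For readers who want a concrete, if vastly simplified, sense of what “casting a circuit as a tensor network” means, here is a minimal sketch in Python. The choice of numpy, the toy circuit and every name in it are my own illustrative assumptions, not the tooling or contraction strategy from Zhang et al.’s paper; the hard part they solve is finding an efficient contraction order for 53 qubits and 20 cycles and spreading that work across 512 GPUs, which this toy does not attempt.

```python
# Toy illustration (NOT the authors' method): a tiny random circuit treated
# as a collection of small tensors, contracted cycle by cycle with numpy.
import numpy as np

rng = np.random.default_rng(0)
n_qubits, n_cycles = 4, 3          # tiny stand-ins for Sycamore's 53 and 20

def random_unitary():
    # A random 2x2 single-qubit gate (Haar-ish, via QR decomposition).
    q, r = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

# Two-qubit CZ gate reshaped into a rank-4 tensor: axes 0,1 out / 2,3 in.
cz = np.diag([1, 1, 1, -1]).reshape(2, 2, 2, 2).astype(complex)

# Start in |0...0>, stored as a rank-n tensor with one index per qubit.
state = np.zeros((2,) * n_qubits, dtype=complex)
state[(0,) * n_qubits] = 1.0

def apply_1q(state, gate, q):
    # Contract the gate's input index with qubit q's index.
    state = np.tensordot(gate, state, axes=([1], [q]))
    return np.moveaxis(state, 0, q)

def apply_2q(state, gate, q1, q2):
    # Contract the gate's two input indices with qubits q1 and q2.
    state = np.tensordot(gate, state, axes=([2, 3], [q1, q2]))
    return np.moveaxis(state, [0, 1], [q1, q2])

for _ in range(n_cycles):
    for q in range(n_qubits):                # a layer of random 1-qubit gates
        state = apply_1q(state, random_unitary(), q)
    for q in range(0, n_qubits - 1, 2):      # a layer of entangling CZ gates
        state = apply_2q(state, cz, q, q + 1)

# The "output" both Google and Zhang et al. care about is the distribution of
# bitstring probabilities |<x|U|0...0>|^2 produced by the random circuit.
probs = np.abs(state.reshape(-1)) ** 2
print(probs.round(4))
```

This contracts the network in simple time order, which is exactly what becomes infeasible at 53 qubits; the paper’s contribution is a much smarter contraction scheme for the full 3D network.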

In Google’s original paper, it was estimated that performing this scale of simulation on the most powerful supercomputer available at the time (Summit at Oak Ridge National Laboratory) would take about 10,000 years — though to be clear, that was their estimate for 54 qubits doing 25 cycles. A 53-qubit, 20-cycle circuit is considerably less complex, but by their estimate it would still take on the order of a few years.

Zhang’s group claims to have done it in 15 hours. And if they had access to a proper supercomputer like Summit, it might be accomplished in a handful of seconds — faster than Sycamore. Their paper will be published in the journal Physical Review Letters; you can read it here (PDF).

These results have yet to be fully vetted and replicated by those knowledgeable about such things, but there’s no reason to think it’s some kind of error or hoax. Google even admitted that the baton may be passed back and forth a few times before supremacy is firmly established, since it’s incredibly difficult to build and program quantum computers while classical ones and their software are being improved constantly. (Others in the quantum world were skeptical of their claims to begin with, but some are direct competitors.)

As University of Maryland quantum scientist Dominik Hangleiter told Science, this isn’t a black eye for Google or a knockout punch for quantum in general by any means: “The Google experiment did what it was meant to do, start this race.”

Google may well strike back with new claims of its own — it hasn’t been standing still either — and I’ve contacted the company for comment. But the fact that it’s even competitive is good news for everyone involved; this is an exciting area of computing and work like Google’s and Zhang’s continues to raise the bar for everyone.



South Africa’s DataProphet closes $10M to scale its AI-as-a-service platform for manufacturers – TechCrunch

Manufacturing plants or factories take raw material inputs and add value through a sequence of unit processes before shipping a product. This process must follow a recipe: for products such as cars, there is a series of instructions listing parameter values, a specific temperature for melting iron, a specific pressure for mold casting, and so on.

These factories, for instance those in the automotive space, perform quality inspections, both in-line and end-of-line, to ensure the cars are in good shape; if they are not, the cars are scrapped or reworked, which means lost capacity and effort for the factory. Employees hired to keep these processes in check can make mistakes; thus, such factories also rely on software to evaluate what is happening on the line, change parameters if needed and ensure that each car reaches the end of the line at as high a quality as possible.

DataProphet is one such company. The South African firm, founded by Frans Cronje and Daniel Schwartzkopff, provides AI-as-a-service software in the manufacturing sector and is announcing the completion of its $10 million Series A round.

Cronje, the company’s CEO, told TechCrunch on a call that DataProphet’s focus on providing end-to-end prescriptive AI for manufacturing plants to improve their yield started in 2017. The company provides prescriptive advice and suggested changes to manufacturers’ recipes to avoid making the defects that cause their products to be scrapped or reworked. The company said its flagship AI solution, PRESCRIBE, has helped its clients experience a significant and practical impact on the factory floor, reducing the cost of non-quality by an average of 40%.

Manufacturers use DataProphet at different points on their digitization journeys; data collation and centralization are crucial to kickstart them. The first product in DataProphet’s stack, CONNECT, enables manufacturers to augment their data infrastructure and bring data from where they’ve been using it for compliance in the manufacturing space to a point where they can use it for optimization. The company currently ingests about 100 million unique data points daily on its platform. With this data, PRESCRIBE can make informed decisions to reduce defects, scrap, or non-quality processes and improve manufacturers’ yield.

Cronje says DataProphet employs a hands-on approach, where it continuously monitors data streams and pushes advice and feedback to the operating floor, ensuring that its clients follow them. And in cases where clients don’t follow the advice DataProphet provides, the company engages with the customer to understand their concerns.

“Usually, when we talk about reducing defects, scrap or rework by an average, we do a reduction of about 40% when the customer follows our advice,” said Cronje, who has a degree in management consultancy and statistics. “It’s a wonderful application of AI and manufacturing because it’s a deep application of the theory to realize practical, meaningful impact for our customers and their yield.”

The 50-person team serves clients mainly from the automotive, semiconductor, rubber and foundry industries, deploying its solution to manufacturing plants based in Japan, China, India, Europe, South Africa, the U.S. and South America. Some of its competitors — which are international, not local — include Braincube and Seebo.

“I think the way we differentiate ourselves is that we approach this from a holistic factory control where implementing our PRESCRIBE solution can enable a customer to realize this full site optimization,” commented Cronje on DataProphet’s unique selling proposition. “And there’s a second aspect: The solution we’ve got to enable customers to realize yield is an end-to-end prescriptive solution. What I mean by that is that it has the capacity to integrate some of the lowest data levels in factories. And we don’t see that in our competitors.” The chief executive also mentioned that, unlike other players, DataProphet doesn’t depend on its clients to have employees with data science capabilities, which defeats the purpose of providing an AI-as-a-service platform that thrives on organizing data infrastructure itself.

Knife Capital led the Series A round. The South African venture capital firm had initially invested in DataProphet in early 2018 via its KNF Ventures Section 12J funding vehicle. This latest round is the first investment made by Knife Fund III, the targeted $50 million fund it launched last year to support the international expansion of its portfolio companies.

“Accelerating the international expansion of DataProphet, given the leading nature of its technology, is exactly the mandate of our new Fund — and it couldn’t be more fitting for our first investment to be a follow-on investment from our existing cohort,” comments Keet van Zyl, co-founder and partner at Knife Capital, on the investment.

Other investors in the round include South Africa’s IDC and Norican, one of the world’s largest metal surface preparation and finishing equipment providers. Per a statement, DataProphet says the infused capital will help it invest further in its industrial AI product set while facilitating targeted growth in selected geographies and manufacturing verticals.

“This is where we’ll be applying a lot of this fund: to support international sales,” added Cronje. “And they’ll support functions needed in markets away from the major engineering hub, South Africa. So part of the investment will be used to develop a European sales office and subsequently a U.S.-based sales office to support customers and partners abroad.”



Elon Musk sells nearly $7 billion in Tesla shares – TechCrunch

Tesla CEO Elon Musk is at it again, selling shares of his electric vehicle company, per regulatory filings. Since Friday, the executive has sold 7.9 million shares, totaling about $6.9 billion. This is the first time Musk has sold Tesla shares since April, when he disposed of 9.6 million shares worth about $8.5 billion.

Musk appears to be selling the shares to stock up on cash in case he’s forced to go through with his $44 billion Twitter acquisition. The executive tweeted Tuesday evening that he was done selling for the moment.

“In the (hopefully unlikely) event that Twitter forces this deal to close and some equity partners don’t come through, it is important to avoid an emergency sale of Tesla stock,” tweeted Musk.

Last month, Musk told Twitter he’s killing the deal because he believed the social media company to be misleading in its bot calculations. However, over the weekend, the executive waffled a bit, tweeting: “If Twitter simply provides their method of sampling 100 accounts and how they’re confirmed to be real, the deal should proceed on original terms. However, if it turns out that their SEC filings are materially false, then it should not.”

Musk also tweeted Tuesday evening that if the Twitter deal doesn’t close, he’ll buy back his shares. Perhaps he’ll wait until Tesla issues its three-to-one stock split, which Tesla shareholders approved last week, so he can buy them back on the cheap.

Over the last ten months, Musk has sold around $32 billion worth of stock in Tesla.

Tesla shares were down 2.44% today but are trading relatively flat in after-hours, suggesting the stock sales have yet to affect Tesla’s share price. Tesla’s stock took a hit late last year when Musk sold off more than $16 billion worth of shares after polling his Twitter followers on whether he should trim his stake, a move that got him in hot water with the Securities and Exchange Commission.

This article has been updated with confirmation from Elon Musk that the stock sales are related to his Twitter acquisition.





The DOJ is reportedly prepping an antitrust suit against Google over its ad business – TechCrunch

The Department of Justice is preparing a second major antitrust suit against Google, according to new reporting by Bloomberg. The DOJ could sue Google “as soon as next month,” according to the report, which details that the lawsuit will be filed in federal court in either Washington or New York.

Unlike the first major Google antitrust case the federal government initiated during the Trump administration, the new lawsuit would focus on the company’s command of the digital ad market. Bloomberg reports that DOJ antitrust lawyers are in the process of wrapping up interviews with publishers after “years of work” that will ultimately culminate in the coming lawsuit.

In 2020, the DOJ sued the tech titan over its dominance in the online search market, accusing the company of “unlawfully maintaining monopolies in the markets for general search services, search advertising, and general search text advertising in the United States.”

At the time, Google pushed back against the suit, arguing that consumers use its product because it is superior, not because they don’t have alternatives.

The Biden administration went out of its way to name a prominent Google critic, Jonathan Kanter, to lead the DOJ’s antitrust division. In a 2016 NYT op-ed, Kanter argued that Google is notorious for using an anti-competitive “playbook” to cut off the oxygen supply to its competitors.

The first Google antitrust suit was filed during Trump’s tenure, but the Biden administration inherited that framework, a rare bit of policy continuity between the Trump and Biden White Houses, and is still working to hold the tech giant accountable for the anti-competitive behavior that cemented its dominance over the last decade.


