
After merger, College Park startup IonQ plans to go public with $2 billion valuation – The Diamondback

IonQ, a quantum computing startup born in College Park, announced Monday that it would likely soon become the first publicly traded company to specialize in commercialized quantum computing.

The company plans to file paperwork with the Securities and Exchange Commission in the next week, which will allow it to go public on the New York Stock Exchange through an acquisition deal that would set the valuation of the combined entity at nearly $2 billion.

"The ability to become a public company gives us access to a huge capital base, and that will allow us to spend more time building our systems, deploying them for useful applications," said Chris Monroe, IonQ's founder and a physics professor at the University of Maryland. "We can start to do our own research and development ... We can do more risky things."

Monroe and co-founder Jungsang Kim formed IonQ with the goal of taking quantum computing to market. They initially received $2 million in seed funding from New Enterprise Associates, along with a license to lab technology from the University of Maryland and Duke University. From there, they were able to raise tens of millions of dollars in funding from companies like Samsung and Mubadala, and partnered with Amazon Web Services and Microsoft.


The company going public was made possible by a planned merger with a blank-check firm, dMY Technology Group Inc. III.

If it goes through, the merger will result in over $650 million in gross proceeds, including $350 million from private investors, according to a press release from IonQ. Combined with the $84 million the company has raised in venture capital funding, the deal would bring IonQ's total funding to about $734 million.

The transition to quantum computing is unprecedented, Monroe said, and it will allow people to solve problems that a regular computer often can't.

Some problems, like optimizing a fleet of trucks or discovering medicines, have too many variables to solve with regular computing. But at the quantum level, more information can be handled, Monroe said, making it radically different from today's computing.

University President Darryll Pines, formerly the dean of the engineering school, explained that classical computing uses a stream of electrical pulses, called bits, which represent 1s and 0s, to store information. On the quantum scale, however, information is stored in quantum bits, or qubits, typically encoded in subatomic particles, which can greatly increase the speed of certain computations.
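A rough way to see why qubits change the picture, offered here as an illustrative sketch that goes beyond the article: n classical bits hold exactly one of 2^n values at a time, while n qubits are described by 2^n complex amplitudes at once.

```python
import numpy as np

# A classical 3-bit register holds exactly one of 2**3 = 8 values at a time.
classical_state = 0b101  # the single value 5

# Three qubits are described by 2**3 = 8 complex amplitudes simultaneously;
# here, an equal superposition over all 8 basis states.
n = 3
amplitudes = np.ones(2**n, dtype=complex) / np.sqrt(2**n)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probabilities = np.abs(amplitudes) ** 2
print(classical_state, probabilities.sum())  # 5 1.0
```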

IonQ's approach to researching quantum computing has been rooted in university-led research. Quantum physics has strange rules that aren't always accepted in the engineering world, Monroe said, so many of these laws have become the domain of research at universities and national laboratories.

And this university especially, with its proximity to Washington, D.C., has one of the biggest communities of quantum scientists, Monroe said.

"We have students and postdocs and all kinds of researchers on Maryland's campus studying the field, and at IonQ, we've hired many of them," Monroe said. "And that's a huge advantage for us."

As a company with about 60 employees, some of whom attended this university, IonQ has become a pioneer in quantum computing. In October, Peter Chapman, IonQ's CEO and president, announced the company's newest 32-qubit computer, the most powerful quantum computer on the market.

And in November, Maryland Gov. Larry Hogan named IonQ one of the state's top 20 startup companies.


The biggest advantage for IonQ has been its technology, Monroe said. Companies like IBM, Google or Microsoft use silicon to build their computers, but IonQ uses individual atoms, which, unlike silicon, float over a chip in a vacuum chamber.

That technology has been perfected at this university, Monroe said, and IonQ has a concrete plan over the next five years to manufacture quantum computer modules and wire them together.

By 2030, 20 percent of global organizations whether in the public or private sector are expected to budget for quantum-computing projects, according to Gartner Inc., a global research and advisory firm. That number is up from less than 1 percent in 2018, according to Gartner.

Niccolo de Masi, CEO of dMY, said in IonQ's press release that he expects the quantum computing industry to grow immensely in the next ten years, with a market opportunity of approximately $65 billion by 2030.

Pines expressed his excitement at seeing a university startup make strides in computing.

"We're happy for building the ecosystem from science, to translation, to startup, to possibly developing a product and adding value to society and growing jobs in the state of Maryland," Pines said.

See the article here:
After merger, College Park startup IonQ plans to go public with $2 billion valuation - The Diamondback


Quantum computing company D-Wave Systems secures $40M in government funding – IT World Canada

Burnaby, B.C.-based D-Wave Systems is getting $40 million from the federal government to help advance its efforts in the development of quantum computing.

The funding comes from Ottawa's Strategic Innovation Fund to support a $120 million project to advance D-Wave's hardware and software.

"Quantum will help us quickly solve problems that would have otherwise taken decades," François-Philippe Champagne, Minister of Innovation, Science and Industry, told reporters during a virtual press briefing for the announcement.

In a separate release, he added that the funding will help place Canada at the forefront of quantum technology development and will create new jobs and opportunities to help Canadians and advance the economy.


D-Wave is the first company to offer a commercially available quantum computer, but it is still only in the early stages of building a sustainable business after 20 years of development and more than US$300 million in funds raised.

D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer last year, replacing Vern Brownell. The company also experienced other changes at the top of the corporate ladder and has parted ways with long-time board members.

Jim Love, Chief Content Officer, IT World Canada

See the original post here:
Quantum computing company D-Wave Systems secures $40M in government funding - IT World Canada


Europe moves to exclude neighbors from its quantum and space research – Science Magazine

A department overseen by European Union research commissioner Mariya Gabriel wants to safeguard strategic research by barring non-EU researchers.

By Nicholas Wallace, Mar. 11, 2021, 4:25 PM

In a sign of growing national tensions over the control of strategic research, the European Commission is trying to block countries outside the European Union from participating in quantum computing and space projects under Horizon Europe, its new research funding program.

The proposed calls, which must still be approved by delegates from the 27 EU member states in the coming weeks, would shut out researchers in countries accustomed to full access to European research programs, including Switzerland, the United Kingdom, and Israel. European Economic Area (EEA) countries Norway, Liechtenstein, and Iceland would be barred from space research calls while remaining eligible for quantum computing projects.

Research advocates see the proposed restrictions as self-defeating for all parties, including the European Union. "It would be a classic lose-lose, with researchers in all countries having to work harder, and spend more, to make progress in these fields," says Vivienne Stern, director of UK Universities International. The unexpected news has upset some leaders of existing collaborations and left them scrambling to find out whether they will need to exclude partners, or even drop out themselves, if they want their projects to be eligible for further funding. "It is really a pity because we have a tight and fruitful relationship with our partners in the U.K.," says Sandro Mengali, director of the Italian research nonprofit Consorzio C.R.E.O. and coordinator of an EU-funded project developing heat shields for spacecraft.

In 2018, when the European Commission first announced plans for the €85 billion, 7-year Horizon Europe program, it said it would be "open to the world." Switzerland, Israel, the EEA nations, and other countries have long paid to associate with EU funding programs like Horizon Europe, giving their researchers the right to apply for grants, just like those in EU member states. After leaving the European Union, the United Kingdom struck a deal in December 2020 to join Horizon Europe, which put out its first grant calls last month through the European Research Council.

But more recently, "strategic autonomy" and "technological sovereignty" have become watchwords among policymakers in Brussels, who argue the European Union should domestically produce components in key technologies, such as quantum computers and space technology. Those views influenced the Commission's research policy department, overseen by EU research commissioner Mariya Gabriel, which drafted the calls and their eligibility rules, first revealed by Science|Business. The draft says the restrictions are necessary to "safeguard the Union's strategic assets, interests, autonomy, or security."

"It's a bit of a contradiction," says a Swiss government official who asked to remain anonymous because of the sensitivity of forthcoming discussions. "You want to open the program to the world and work with the best. But the core group of associated countries with whom you're used to working, suddenly you exclude them and force them to work with the competitors." The official says the Commission gave no warnings the proposal was coming but believes the combination of Brexit and the COVID-19 crisis, in which Europe has struggled to secure access to vaccines, masks, and other equipment, may have further spurred Europe to guard its technologies. Negotiations on Swiss membership in Horizon Europe have not begun, but the country intends to join.

The restrictions affect €170 million in funding that could be available in the next few months. The affected areas include quantum computing, quantum communications, satellite communications, space transport, launchers, and "space technologies for European non-dependence and competitiveness." Projects relating to the Copernicus Earth-observation system and the Galileo satellite navigation programs would remain largely open to associated countries.

Shutting out the associated countries would be a "lost opportunity" and could slow progress in quantum computing, says Lieven Vandersypen, a quantum nanoscientist at the Delft University of Technology. "To me, it doesn't make sense." Vandersypen contributes to an EU-funded project that is investigating how to create the basic bits of a quantum computer from cheap and readily available silicon. The project includes U.K. and Swiss researchers at University College London and the University of Basel. "They are in there for a good reason," Vandersypen says. "They bring in really valuable expertise." With a few years left on the grant, the project isn't in any immediate danger. But the exclusions are bad for long-term planning, Vandersypen says.

Non-EU researchers working on a €150 million European quantum flagship initiative set up in 2018 are also upset by the sudden reversal and wonder about their future status. "We discuss with our partners in Europe, they ask us, 'Can you join?' And we don't know; that's probably the worst thing," says Hugo Zbinden, a quantum physicist at the University of Geneva and coordinator of one of these flagship projects, QRANGE, which is investigating how a quantum random number generator can be used to improve encryption.

The restrictions are not yet set in stone; national delegates could reject the draft calls and ask the Commission to open them up. But member states accepted the legal basis for the restrictions last year, when they agreed to the Horizon Europe legislation. "Of course, you hope that we will be in," Zbinden says. "For the time being, we are waiting for some news."

More here:
Europe moves to exclude neighbors from its quantum and space research - Science Magazine


After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing – The Globe and Mail

D-Wave is the first company to offer a commercially available quantum computer.

Reuters

One of Canada's most heavily financed technology development companies, quantum computer maker D-Wave Systems Inc., has secured a $40-million financial contribution from the federal government.

The funding, through Ottawa's Strategic Innovation Fund, follows a year of reset expectations for D-Wave, a leader in the global race to develop computers whose chips draw their power by harnessing natural properties of subatomic particles to perform complex calculations faster than conventional computers.

Burnaby, B.C.-based D-Wave is the first company to offer a commercially available quantum computer, but after 20-plus years of development and more than US$300-million in funds raised, it is still in the early stages of building a sustainable business.


Last year D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer, replacing Vern Brownell, to step up commercialization efforts. The company also parted ways with other top executives and long-time board members.

Mr. Baratz, who led Sun Microsystems Inc.'s effort in the 1990s to transform Java from a nascent programming language into the internet's main software-writing platform, directed D-Wave to stop selling its shed-sized computers, which listed for US$15-million and had just a handful of customers including NASA, Google, Lockheed Martin and the U.S. Los Alamos National Laboratory.

Instead, D-Wave has focused on selling online access to the technology and expanded its software applications, which Mr. Baratz had started developing after joining as chief product officer in 2017. Customers including Volkswagen and biotechnology startups have used D-Wave's technology to find answers to dense optimization problems, such as improving traffic flows in big cities, identifying proteins that could become breakthrough drugs and improving the efficiency of painting operations on vehicle production assembly lines.

D-Wave also completed a costly US$40-million refinancing last year that wiped out most of the value of some long-time investors, including the U.S. Central Intelligence Agency's venture capital arm, Amazon CEO Jeff Bezos and fund giant Fidelity Investments. The capital restructuring cut D-Wave's valuation to less than US$170-million, down from US$450-million, The Globe reported in October. Investors that ponied up, including Public Sector Pension Investment Board, D-Wave's top shareholder, BDC Capital and Goldman Sachs, maintained their relative stakes, limiting their writedowns.

"Over the years [D-Wave has] had to raise money and more money and more money ... and as such you end up getting diluted over time because every third quarter it seems like you run out of the $50-million that you raised," Kevin Rendino, CEO and portfolio manager of D-Wave investor 180 Degree Capital Corp., told his investors last November. "D-Wave has been a source of bitter disappointment for all of us."

Meanwhile, D-Wave faces years and tens of millions of dollars more in costs to continue developing its core technology. The government aid will support a $120-million project to advance D-Wave's hardware and software and will "help place Canada at the forefront of quantum technology development, and will create new jobs and opportunities to help Canadians and advance the economy," François-Philippe Champagne, Minister of Innovation, Science and Industry, said in a release.

During a press conference to discuss the funding, the minister was asked if the government would review potential takeovers of quantum computing companies, as the U.S. government is considering doing. Mr. Champagne provided a non-committal response, saying "I'm sure you would expect us to be eyes wide open when it comes to whatever we would need to take in terms of steps to protect ... [intellectual property] that has been developed in Canada."


"We're always out there looking at how we can improve to make sure that new technologies and inventions and improvements and IP that has been developed in Canada stays in Canada."

D-Wave faces a slew of competitors, including Google, Microsoft, Intel, IBM and Honeywell, that are also trying to build the first quantum machine that can outperform classical or conventional computers. In addition, a new class of startups, including Toronto's Xanadu Quantum Technologies Inc. and College Park, Md.-based IonQ Inc., believe they can build quantum chips that don't have to be supercooled to function, as D-Wave's system and others in development do. IonQ said this week it would go public through a special purpose acquisition company to become the first publicly traded quantum computing-focused company.

Mr. Baratz said in an emailed statement that since D-Wave's launch last September of its latest quantum chip and expanded efforts to sell online access to its computers, "we've been encouraged by the positive customer response to the value delivered by a quantum system designed for practical, in-production business-scale applications. We're eager to see even more developers, academics, and companies leverage it to solve larger, more complex problems."


View post:
After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing - The Globe and Mail


Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? – Analytics India Magazine

As per OpenAI data, the amount of computational power needed to train large AI models has grown massively, doubling every three and a half months since 2012. GPT-3, which requires 3.14E23 FLOPS of computing for training, is a good case in point.
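As a rough, back-of-the-envelope illustration (our arithmetic, not OpenAI's), a doubling time of about three and a half months compounds very quickly:

```python
# Illustrative growth calculation, assuming compute doubles every 3.5 months.
doubling_months = 3.5
years = 3
doublings = years * 12 / doubling_months   # ~10.3 doublings in three years
growth_factor = 2 ** doublings             # roughly a 1,200x increase
print(f"{doublings:.1f} doublings -> ~{growth_factor:,.0f}x more compute in {years} years")
```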

Typically, to carry out high-performance computing tasks, conventional AI chips are equipped with transistors that work with electrons. Although they perform a wide array of complex, high-performance tasks, energy consumption and engineering glitches pose a challenge. Thus, the growing need for computing power has set researchers on a quest to find a workaround to boost these chips' power without increasing energy consumption.

And that's when experts turned to photons, the particles of light, which can substitute for electrons in AI chips to reduce heat, leading to a massive reduction in energy consumption and a dramatic upgrade in processor speed.

While electronic chips perform calculations by reducing information to a series of 1s and 0s, photonic chips split and mix beams of light within tiny channels to carry out the same tasks. Compared to general-purpose AI chips, photonic chips are designed to perform only a certain kind of mathematical calculation, one that is critical for running large AI models.
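The "certain kind of mathematical calculation" in question is, in most photonic accelerator designs, the multiply-accumulate work of linear algebra. Here is a minimal sketch of that core operation, written in plain NumPy for illustration rather than against any vendor's API:

```python
import numpy as np

# A dense neural-network layer is essentially y = W @ x + b.
# Photonic accelerators aim to perform this matrix-vector product in the
# analog optical domain; here it is simulated digitally for illustration.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))   # layer weights (would be encoded in the optical mesh)
x = rng.normal(size=8)        # input activations (would be encoded in light intensities)
b = np.zeros(4)

y = W @ x + b                 # the multiply-accumulate step the hardware speeds up
print(y)
```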

Lightmatter, an MIT-backed startup, last year developed an AI chip, Envise, that leverages photons (light particles) to perform computing tasks.

Light travels faster than electrons. The concept of using light to carry out heavy computing tasks (aka photonic computing or optical computing) dates back to the 1980s, when Bell Labs, the American industrial research and scientific development company now known as Nokia Bell Labs, tried to develop a light-based processor. However, due to the impracticality of creating a working optical transistor, the concept didn't take off.

We experience optical technology in cameras, CDs, and even in Blu-ray discs. But these photons are usually converted into electrons before being processed by chips. Four decades later, photonic computing gained momentum when IBM and researchers from the Universities of Oxford and Münster developed a system that uses light instead of electricity to perform several AI model-based computations.

Alongside, Lightmatter's new AI chip has created a buzz in the industry. According to the company website, Envise can run the largest neural networks at three times higher inferences/second than the Nvidia DGX-A100, with seven times the inferences/second/watt on BERT-Base with the SQuAD dataset.

Japan-based NTT has also been developing an optical computer believed to outpace quantum computing at solving optimisation problems. Last year, Chinese quantum physicist Chao-Yang Lu also announced light-based quantum computing.

Other companies like US-based Honeywell and IonQ have also been working around the issue by using trapped ions.

Such developments have led experts to believe photonic computing will gain ground once the big tech companies throw their weight behind it and understand the importance of using light for their AI chips.

On the other hand, like any other remarkable technology, photonic computing also comes with certain challenges. Despite their lower energy consumption, photonic chips are considered less accurate and precise than electron-based chips. Much of this can be attributed to their analogue calculations, which make them best suited to running pre-trained models and deep neural networks.

On the design side, silicon-based computer chips do not integrate easily with photons, which limits their usage in computing.

The cost issues and environmental impact of digital chips might set the stage for photonic computing to rise as a substitute. With startups like Lightmatter and giants like IBM committing resources to this computing paradigm, AI might get a photonic boost.

Read the rest here:
Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? - Analytics India Magazine


Working at the intersection of data science and public policy | Penn Today – Penn Today

One of the ideas you discuss in the book is algorithmic fairness. Could you explain this concept and its importance in the context of public policy analytics?

Structural inequality and racism are the foundation of American governance and planning. Race and class dictate who gets access to resources; they define where one lives, where children go to school, access to health care, upward mobility, and beyond.

If resource allocation has historically been driven by inequality, why should we assume that a fancy new algorithm will be any different? This theme is present throughout the book. Those reading for context get several in-depth anecdotes about how inequality is baked into government data. Those reading to learn the code get new methods for opening the algorithmic black box, testing whether a solution further exacerbates disparate impact across race and class.

In the end, I develop a framework called algorithmic governance, helping policymakers and community stakeholders understand how to trade off algorithmic utility with fairness.
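As an illustrative sketch only (the book itself works in R; this is a hypothetical Python example with made-up data), the kind of disparate-impact test described above can be as simple as comparing an algorithm's selection rates across groups, for instance against the common four-fifths rule:

```python
import pandas as pd

# Hypothetical audit data: which applicants an algorithm flagged for a benefit.
df = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "selected": [1,    1,   0,   1,   0,   1,   0,   0],
})

# Selection rate per group, and the disparate impact ratio (min rate / max rate).
rates = df.groupby("group")["selected"].mean()
di_ratio = rates.min() / rates.max()

print(rates.to_dict())                            # e.g. {'A': 0.75, 'B': 0.25}
print(f"disparate impact ratio: {di_ratio:.2f}")  # < 0.8 flags potential disparate impact
```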

From your perspective, what are the biggest challenges in integrating tools from data science with traditional planning practices?

Planning students learn a lot about policy but very little about program design and service delivery. Once a legislature passes a $50 million line item to further a policy, it is up to a government agency to develop a program that can intervene with the affected population, allocating that $50 million in $500, $1,000 or $5,000 increments.

As I show in the book, data science combined with government's vast administrative data is good at identifying at-risk populations. But doing so is meaningless unless a well-designed program is in place to deliver services. Thus, the biggest challenge is not teaching planners how to code data science but how to consider algorithms more broadly in the context of service delivery. The book provides a framework for this by comparing an algorithmic approach to service delivery to the business-as-usual approach.

Has COVID-19 changed the way that governments think about data science? If so, how?

Absolutely. Speaking of service delivery, data science can help governments allocate limited resources. The COVID-19 pandemic is marked entirely by limited resources: from testing, PPE, and vaccines to toilet paper, home exercise equipment, and blow-up pools (the latter was a serious issue for my 7-year-old this past summer).

Government failed at planning for the allocation of testing, PPE, and vaccines. We learned that it is not enough for government to invest in a vaccine; it must also plan for how to allocate vaccines equitably to populations at greatest risk. This is exactly what we teach in Penn's MUSA Program, and I was disappointed at how governments at all levels failed to ensure that the limited supply of vaccine aligned with demand.

We see this supply/demand mismatch show up time and again in government, from disaster response to the provision of health and human services. I truly believe that data can unlock new value here, but, again, if government is uninterested in thinking critically about service delivery and logistics, then the data is merely a sideshow.

What do you hope people gain by reading this book?

There is no equivalent book currently on the market. If you are an aspiring social data scientist, this book will teach you how to code spatial analysis, data visualization, and machine learning in R, a statistical programming language. It will help you build solutions to address some of today's most complex problems.

If you are a policymaker looking to adopt data and algorithms into government, this book provides a framework for developing powerful algorithmic planning tools, while also ensuring that they will not disenfranchise certain protected classes and neighborhoods.

See the original post here:

Working at the intersection of data science and public policy | Penn Today - Penn Today


Jupyter has revolutionized data science, and it started with a chance meeting between two students – TechRepublic

Commentary: Jupyter makes it easy for data scientists to collaborate, and the open source project's history reflects this kind of communal effort.

Image: iStockphoto/shironosov

If you want to do data science, you're going to have to become familiar with Jupyter. It's a hugely popular open source project that is best known for Jupyter Notebooks, a web application that allows data scientists to create and share documents that contain live code, equations, visualizations and narrative text. This proves to be a great way to extract data with code and collaborate with other data scientists, and has seen Jupyter boom from roughly 200,000 Notebooks in use in 2015 to millions today.
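As a small illustration (not from the article), a notebook is just a structured document of cells and can even be built programmatically, for example with the nbformat library:

```python
import nbformat

# Build a small notebook with one markdown cell and one code cell.
nb = nbformat.v4.new_notebook()
nb.cells = [
    nbformat.v4.new_markdown_cell("# A tiny demo notebook"),
    nbformat.v4.new_code_cell("import math\nprint(math.pi)"),
]

# Write it to disk; opening demo.ipynb in Jupyter shows the two cells.
with open("demo.ipynb", "w", encoding="utf-8") as f:
    nbformat.write(nb, f)
```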

Jupyter is a big deal, heavily used at companies as varied as Google and Bloomberg, but it didn't start that way. It started with a friendship. Fernando Pérez and Brian Granger met the first day they started graduate school at University of Colorado Boulder. Years later in 2004, they discussed the idea of creating a web-based notebook interface for IPython, which Pérez had started in 2001. This became Jupyter, but even then, they had no idea how much of an impact it would have within academia and beyond. All they cared about was "putting it to immediate use with our students in doing computational physics," as Granger noted.

Today Pérez is a professor at University of California, Berkeley, and Granger is a principal at AWS, but in 2004 Pérez was a postdoctoral student in Applied Math at UC Boulder, and Granger was a new professor in the Physics Department at Santa Clara University. As mentioned, they first met as students in 1996, and both had been busy in the interim. Perhaps most pertinently to the rise of Jupyter, in 2001 Pérez started dabbling in Python and, in what he calls a "thesis procrastination project," he wrote the first IPython over a six-week stretch: a 259-line script now available on GitHub ("Interactive execution with automatic history, tries to mimic Mathematica's prompt system").


It would be tempting to assume this led to Pérez starting Jupyter--it would also be incorrect. The same counterfactual leap could occur if we remember that Granger wrote the code for the actual IPython Notebook server and user interface in 2011. This was important, too, but Jupyter wasn't a brilliant act by any one person. It was a collaborative, truly open source effort that perhaps centered on Pérez and Granger, but also people like Min Ragan-Kelley, one of Granger's undergraduate students in 2005, who went on to lead development of IPython Parallel, which was deeply influential in the IPython kernel architecture used to create the IPython Notebook.

However we organize the varied people who contributed to the origin of Jupyter, it's hard to get away from "that one conversation."

In 2004 Pérez visited Granger in the San Francisco Bay Area. The old friends stayed up late discussing open source and interactive computing, and the idea to build a web-based notebook came into focus as an extension of some parallel computing work Granger had been doing in Python, as well as Pérez's work on IPython. According to Granger, they half-jokingly talked about these ideas having the potential to "take over the world," but at that point their idea of "the world" was somewhat narrowly defined as scientific computing within a mostly academic context.

Years (and a great deal of activity) later, in 2009, Pérez was back in California, this time visiting Granger and his family at their home in San Luis Obispo, where Granger was now a professor. It was spring break, and the two spent March 21-24 collaborating in person to complete the first prototype IPython kernel with tab completion, asynchronous output and support for multiple clients.

By 2014, after a great deal of collaboration between the two and many others, Pérez, Granger and the other IPython developers co-founded Project Jupyter and rebranded the IPython Notebook as the Jupyter Notebook to better reflect the project's expansion outwards from Python to a range of other languages including R and Julia. Pérez and Granger continue to co-direct Jupyter today.

"What we really couldn't have foreseen is that the rest of the world would wake up to the value of data science and machine learning," Granger stressed. It wasn't until 2014 or so, he went on, that they "woke up" and found themselves in the "middle of this new explosion of data science and machine learning." They just wanted something they could use with their students. They got that, but in the process they also helped to foster a revolution in data science.

How? Or, rather, why is it that Jupyter has helped to unleash so much progress in data science? Rick Lamers explained:

Jupyter Notebooks are great for hiding complexity by allowing you to interactively run high level code in a contextual environment, centered around the specific task you are trying to solve in the notebook. By ever increasing levels of abstraction data scientists become more productive, being able to do more in less time. When the cost of trying something is reduced to almost zero, you automatically become more experimental, leading to better results that are difficult to achieve otherwise.

Data science is...science; therefore, anything that helps data scientists to iterate and explore more, be it elastic infrastructure or Jupyter Notebooks, can foster progress. Through Jupyter, that progress is happening across the industry in areas like data cleaning and transformation, numerical simulation, exploratory data analysis, data visualization, statistical modeling, machine learning and deep learning. It's amazing how much has come from a chance encounter in a doctoral program back in 1996.

Disclosure: I work for AWS, but the views expressed herein are mine.


See original here:

Jupyter has revolutionized data science, and it started with a chance meeting between two students - TechRepublic


Gartner: AI and data science to drive investment decisions rather than "gut feel" by mid-decade – TechRepublic

Turns out, "calling it from the gut," may become a strategy of the past as data increasingly drives decision-making. But how will these data-driven approaches change investment teams?

Image: iStock/metamorworks

In the age of digital transformation, artificial intelligence and data science are allowing companies to offer new products and services. Rather than relying on human-based intuition or instincts, these capabilities provide organizations with droves of data to make more informed business decisions.

Turns out, "calling it from the gut," as the adage goes, may become an approach of the past as data increasingly drives investment decisions. A new Gartner report predicts that AI and data science will drive investment decisions rather than "gut feel" by mid-decade.

"Successful investors are purported to have a good 'gut feel': the ability to make sound financial decisions from mostly qualitative information alongside the quantitative data provided by the technology company," said Patrick Stakenas, senior research director at Gartner, in a blog post. "However, this 'impossible to quantify inner voice' grown from personal experience is decreasingly playing a role in investment decision making."

Instead, AI and data analytics will inform more than three-quarters of "venture capital and early-stage investor executive reviews," according to a Gartner report published earlier this month.

"The traditional pitch experience will significantly shift by 2025, and tech CEOs will need to face investors with AI-enabled models and simulations as traditional pitch decks and financials will be insufficient," Stakenas said.


Alongside data science and AI, crowdsourcing will also help play a role in "advanced risk models, capital asset pricing models and advanced simulations evaluating prospective success," per Gartner. While the company expects this data-driven approach, as opposed to an intuitive approach, to become the norm for investors by mid-decade, the report also notes a specific use case highlighting these methods.

Correlation Ventures uses information gleaned from a VC financing and outcomes database to "build a predictive data science model," according to Gartner, allowing the fund to increase both the total number of investments and the investment process timeline "compared with traditional venture investing."

"This data is increasingly being used to build sophisticated models that can better determine the viability, strategy and potential outcome of an investment in a short amount of time. Questions such as when to invest, where to invest and how much to invest are becoming almost automated," Stakenas said.


A portion of the report delves into the myriad ways these shifts in investment strategy and decision making could alter the skills venture capital companies seek and transform the traditional roles of investment managers. For example, Gartner predicts that a team of investors "familiar with analytical algorithms and data analysis" will augment investment managers.

These new investors, who are "capable of running terabytes of signals through complex models to determine whether a deal is right for them," will apply this information to enhance "decision making for each investment opportunity," according to the report.

The report also includes a series of recommendations for tech CEOs to develop in the next half-decade. This includes correcting or updating quantitative metrics listed on social media platforms and company websites for accuracy. Additionally, to increase a tech CEO's "chances of making it to an in-person pitch" they should consider adapting leadership teams and ensure "online data showcases diverse management experience and unique skills," the report said.


More:

Gartner: AI and data science to drive investment decisions rather than "gut feel" by mid-decade - TechRepublic


Postdoctoral Position in Transient and Multi-messenger Astronomy Data Science in Greenbelt, MD for University of MD Baltimore County/CRESST II -…

Postdoctoral Position in Transient and Multi-messenger Astronomy Data Science

The High Energy Astrophysics Science Archive Research Center (HEASARC) and Time-domain Astronomy Coordination Hub (TACH) at NASA's Goddard Space Flight Center (GSFC) invite applications for postdoctoral research positions in the fields of transient and/or multi-messenger astronomy. Applicants should have a strong astronomy research track record and also deep expertise in the technical disciplines of full-stack software development, cloud computing, and data visualization. Experience in machine learning and/or time-series databases would also be beneficial.

Successful applicants will join HEASARC/TACH and have a central role in shaping Goddard's multi-messenger science output. This position is funded at 100% FTE. Approximately half of the applicant's time will be devoted to HEASARC/TACH, including activities such as software engineering, shaping next-generation Kafka-based NASA astronomy alert systems, pipeline development, and collaboration with Goddard-supported missions. The remainder of the applicant's time is available for self-driven research projects.

GSFC is home to over 100 Ph.D. astronomers, including project teams for Swift, Fermi, NICER, NuSTAR, TESS, JWST, and Roman, as well as ample computational resources. GSFC is also a member of the LIGO Scientific Collaboration. Through the Joint Space-Science Institute (JSI), GSFC is a partner in the Zwicky Transient Facility project. The successful applicants will also have the opportunity to apply for time on the 4.3m Lowell Discovery Telescope in Happy Jack, AZ.

The positions are for two years, renewable for a third year upon mutual agreement, and will be hired through the University of Maryland, Baltimore County on the CRESST II collaborative agreement with GSFC. The nominal starting date is in Fall 2021, but alternate dates are possible depending on availability. Candidates must have a Ph.D. in astronomy, physics, or a related field by the date of appointment.

Candidates should provide a cover letter, CV (including publication list), and a 3-page statement of research interests. Short-listed candidates will be asked to supply three letters of reference at a later date. Completed applications received by Friday, April 30, 2021 will receive full consideration. All application materials and inquiries should be sent to:

Transient and Multi-messenger Astronomy Data Science Postdoctoral Position
CRESST/UMBC
Mail Code 660.8, NASA/GSFC
Greenbelt, MD 20771
or via e-mail to katherine.s.mckee@nasa.gov

Salary and benefits are competitive, commensurate with experience and qualifications. For more information about the proposed research, contact Dr. Judith Racusin (judith.racusin@nasa.gov). For information on CRESST II or UMBC, contact Dr. Don Engel (donengel@umbc.edu).

UMBC is an equal opportunity employer and welcomes all to apply. EOE/M/F/D/V. The TACH project and NASA/GSFC are committed to building a diverse group and encourage applications from women, racial and ethnic minorities, individuals with disabilities and veterans.

Read more:

Postdoctoral Position in Transient and Multi-messenger Astronomy Data Science in Greenbelt, MD for University of MD Baltimore County/CRESST II -...


DefinedCrowd CEO Daniela Braga on the future of AI, training data, and women in tech – GeekWire

DefinedCrowd CEO Daniela Braga. (Dário Branco Photo)

Artificial intelligence is the fourth industrial revolution and women had better play a prominent role in its development, says Daniela Braga, CEO of Seattle startup DefinedCrowd.

"We left the code era to men over the last 30 years and look at where it got us," Braga told attendees of the recent Women in Data Science global conference. "Ladies, let's lead the world to a better future together."

Technology has of course led to amazing advancements in health, communications, education and entertainment, but it has also created a more polarized and extremist society, spread dangerous misinformation and excluded swaths of the population from participation. A 2018 study by Element AI found that only 13% of U.S. AI researchers were women.

Braga thinks we can do better. She is a co-founder of DefinedCrowd, an AI training data technology platform that launched in December 2015. Braga took over as CEO in mid-2016. The company is ranked No. 21 on the GeekWire 200, our list of top Pacific Northwest tech startups, and has reeled in $63.6 million in venture capital, including a $50 million round raised last year.

We caught up with Braga after the conference to learn how AI is usurping coding; the need to impose ethics and regulations on where it takes us; and the need for more women in the industry and in AI leadership. Here are some key takeaways:

We spent five centuries in the print era, 30 years in software, and now AI is on a path to supplant software, Braga said. And while coding and programmers drove software development, it's data and data scientists that produce AI.

"You don't program rules, you teach AI with data that is structured and [there's] a lot of it," Braga said. "The data allows us to train a brain, an artificial brain, in a week instead of what it used to take us months to code."

And it's so much more powerful. Traditional coding, which is essentially built from if-then decision-making rules, isn't capable of controlling complex tasks like self-driving cars or virtual assistants that require subtle assessments and decision making.
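A toy sketch, ours rather than Braga's, of the contrast she draws: a hand-written if-then rule next to a "model" whose behavior is derived from labeled data.

```python
# Hand-written rule: the programmer encodes the decision logic explicitly.
def rule_based_spam_filter(msg: str) -> bool:
    return "free money" in msg.lower() or "winner" in msg.lower()

# Data-driven alternative: the "rule" (a score per word) is learned from
# labeled examples instead of being written by hand.
def train_word_scores(examples):
    scores = {}
    for text, is_spam in examples:
        for word in text.lower().split():
            scores[word] = scores.get(word, 0) + (1 if is_spam else -1)
    return scores

def learned_spam_filter(msg: str, scores) -> bool:
    return sum(scores.get(w, 0) for w in msg.lower().split()) > 0

training_data = [
    ("claim your free money now", True),
    ("you are a winner", True),
    ("meeting moved to 3pm", False),
    ("lunch tomorrow?", False),
]
scores = train_word_scores(training_data)
print(rule_based_spam_filter("you are a winner"),
      learned_spam_filter("free money inside", scores))
```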

We're still at the dawn of AI, Braga said, or what's called narrow AI. In these early days, the field needs to be incorporating rules and standards to make sure that AI is used in ways that are ethical, unbiased and protect privacy. Oversight is needed at an international level that brings in a diversity of voices.

"We need an alliance, almost like a United Nations for AI," she said.

The data used to train AI needs to be high quality, which for Braga means it's accurate, representative and unbiased. It also should be monitored, anonymized so it can't be traced to its sources, and collected from people who consent to providing their information. Braga admittedly has a vested interest in this matter, as her company's business is to provide the data that companies use to train their AI. DefinedCrowd's focus is speech and natural language processing.

In an infamous case of what can happen when AI is trained on bad data, Microsoft's AI chatbot named Tay was quickly corrupted in 2016 when online users fed it racist, misogynistic language and conspiracy theories that the personal assistant parroted back to the public.

While we're in narrow AI now, the next steps are general AI and super AI, Braga said. As the technology matures, the different AI systems will be able to communicate with each other. Navigation, for example, will mix with voice interaction combined with text messaging. Home and work AI domains will talk together.

"There are some people who say when you start interlinking so many things you will have an AI that may become sentient," Braga said, creating technology such as the personal assistant in the movie "Her," who is so lifelike and charming that the protagonist falls in love.

"The super AI is when AI is smarter, thinking faster, thinking better than humans," Braga said. "So that is the part that is science fiction, where the machine will take over the world. We're very far from that."

Women bring emotional intelligence that technology should have, Braga said. "It's just that emotional intelligence component, that creativity, that warmth, that should resemble more a human ... that aspect does not come through built by men alone."

Data science is different from traditional software engineering. While the latter focused on programming languages, math and statistics, work in AI incorporates linguistics, psychology and ethics to a greater degree.

DefinedCrowd is trying to build a diverse workforce and has an office in Portugal, where Braga was born and raised. The company's staff is about 32% female, but it's difficult to recruit qualified women, particularly for senior roles. What's even tougher, Braga said, is finding women at her level for mentoring and support.

There are a handful of founder/CEOs at AI-focused companies, including Daphne Koller of insitro and Rana el Kaliouby of Affectiva. And only 22 women who hold the title of founder/CEO have taken their companies public among thousands of IPOs over the decades, according to Business Insider.

"I always have a super hard time finding women to look up to because it just doesn't exist. I'm basically paving my way by myself. I don't have role models," Braga said. "It's really hard to not have a way to bounce ideas within a safe circle."

Read more here:

DefinedCrowd CEO Daniela Braga on the future of AI, training data, and women in tech - GeekWire
