
From "data for good" to "data for impact" – NationSwell

Data science has the power to accelerate social and environmental progress. Yet according to Salesforce.org's Nonprofit Trends Report, only 22% of social impact organizations have achieved high data maturity today. As a result, the data for good sector tends to rely too heavily on creating flashy new tools to fix problems. But these tools often fail to move the needle on real impact, and many underserved communities, and the nonprofit organizations that serve them, still need better access to the skills and capacity required to leverage these innovations.

In order to actually deliver impact through data science at scale, what needs to change across our sector? At a recent data.org event, we convened social impact organizations, funders, and data science leaders to explore ways to address this challenge. We sought participants' insights and gained a clearer sense of what it will take for data to be accessed and applied for good. What follows are three calls to action that emerged from our conversation. We believe that realizing these calls would catalyze a shift toward scalable, sustainable, and genuinely community-driven projects that help the social good sector use data science to realize impact.

It's easy to fall for the flash and glimmer of a new AI solution, but we can't stop there. We have to deepen our understanding of the problems we are trying to solve, and our commitment to working with the people and communities that experience real challenges every day. This might seem like a small shift, but it's seismic. It pushes us beyond thinking only about the mechanics of a technical solution and instead challenges us to ask how new technology can change the balance of power in favor of people and communities that have been systematically excluded or harmed. To be clear, passion for new technical solutions isn't bad. Many problems we face in the social impact sector do require innovation and creativity. But simply having a new approach doesn't guarantee actual impact. Our metric for success cannot simply be that we delivered a solution. That solution must meaningfully contribute to reducing suffering or improving equity.

Doing this isn't easy. It requires technical experts to diversify their networks and engage with humility. True understanding of social issues cannot be reached without community experience and partnership. Creating technology far from the community it purports to benefit rarely works. Instead, we must partner with communities to develop solutions that are responsive and designed to scale in the real world.

Funders play a critical role in shifting the focus from novel solutions to actual impact. Much of the innovation funding ecosystem currently focuses on building new things instead of investing in long-term capacity building and problem solving. As solution builders, it can be easy to lose focus on the impact you seek in favor of amplifying what will be most attractive to funders. Change makers and funders bear a joint responsibility to honor the complexity and context of the problem at hand and continually seek to deliver impact, not getting distracted by a desire to over-index on what might be considered the shiny, data-driven technology of the moment. Disciplined focus on what specific problem data science is helping you understand or address at any one moment is essential to unlocking the power of this technology. Without that discipline, the use of data science can be distracting and can dilute or derail your impact.

So, we must follow the problem. And one of the things we might learn as we follow it is that the problem is not solvable by a single data science method. For people coming from data science and engineering backgrounds, that means you might have to admit that you aren't the biggest part of the solution. That reflection, and the maturity around it, is absolutely critical for figuring out what you can do, for finding an angle in, an approach, or an impact model that actually speaks to the real problem. You have to identify what problem you are capable of solving and find true product-impact fit. While following the problem seems intuitive, it is inherently difficult. But it's urgently necessary if we want to advance and truly use data to drive impact rather than just giving rise to pilots that explore emerging technologies. As social impact officers, implementers, and funders, we must honor the complexity of the problems we seek to solve, and be committed enough to fall in love with the problems themselves.

Advancing our sector also means seeing and supporting projects through to the very end, to where people are applying them in their everyday lives or organizations. It is much easier to build a new product and get it to a Minimum Viable Product stage. But then, to deliver on the impact, you have to actually use the product over time. You have to build the muscle for iteration. Embracing iteration helps solve one key challenge social impact organizations face: a lack of clarity around the metric for which they are optimizing. In profit-driven business, it's much more straightforward: does a new recommender algorithm, for example, increase engagement, conversions, and then revenue? But for social impact organizations, measurement, and even agreement on what the key metrics are, can be messier. Building a muscle for iteration means you commit to actually looking at the outcomes of deploying a new method, and that you're able to regularly and reliably measure those outcomes at a reasonable cost. And like building muscle in the gym, this process requires trial and error and an ongoing commitment.

Funders have traditionally taken a linear, short-term approach to supporting solutions, providing resources to get to the end of an initial pilot, for instance, but the messy nature of achieving impact goals demands a more iterative mindset and approach. Common case studies for success, like BlueConduit's data-driven approach to helping Flint with its water crisis or GiveDirectly's efforts to use data science to target cash transfers for COVID-19 relief, reinforce a seemingly seamless narrative of idea, implementation, and success, with funding and governmental support at every step of the journey. However, those seamless journeys are the exception, not the rule. The reality of driving impact outcomes is more like life: unpredictable and requiring constant course correction. Imagine an exciting new algorithm that promises to solve hunger in a community. We might expect there to be funding to build the algorithm, have the paper written about it, and get press published; but when it comes to working through its application with 20 nonprofits with different use cases, we may realize that the algorithm needs continuous refining, that testing and refining will take us in new and unexpected directions around how to effectively serve diverse neighborhoods, or, at worst, that no one needs the technology in its initial form, and we'll have to go back to the drawing board and build something fundamentally different.

That's where our current systems for funding and support can fall apart. So, we need solution builders and funders to anticipate and embrace the 2.0s of a project, the 3.0s, and beyond. Only through the creation of Minimum Viable Products and their testing phases can we understand which component of the problem statement we can effectively influence, improve, predict, or make more efficient.

Sustaining and scaling data science for impact requires a deep commitment to capacity building and technical education. This capacity building must happen across the ecosystem, from implementing organizations through to funders. At this stage, investing in the capacity of humans is probably the most powerful thing we can do to move along the transformation curve. Because humans and systems are what actually move the needle on solving problems, investments in human systems ensure that innovation happens at scale, rather than one thing at a time.

Katharine Lucey, who leads Solar Sister, is a perfect example of what you unlock when you invest in the humans and internal capacity behind a solution. With data.org's support through the Inclusive Growth and Recovery Challenge, she invested in making sure she had data experts on her team and the budget to support them in the long term. As a result, her work supporting local women entrepreneurs in Africa who work with clean energy has become a model for how data science can help steer social impact. That evolution is the direct result of investments in capacity.

As another example of building the capacity of partners: the Center for Effective Global Action (CEGA) devised a system for locating and measuring poverty. But the step that actually helped people in poverty was getting money to them, and having policymakers who understood the system and could adapt it and move it through. So the CEGA system of poverty measurement was important, but only insofar as it enabled a sophisticated, human-driven administrative process that was actually distributing money.

At the end of the day, it will be our subject matter experts who understand the complexity and context of the challenges faced by the communities seeking to solve problems in their neighborhoods. We have a responsibility to make sure that this type of thinking, learning, and tooling is available. How do we train more? How do we implement more for more people? As problem solvers, and funders of problem solvers, we need more consideration of the patience of capital, especially when we're talking about product-impact fit and learning how to fund product roadmaps. We need to be asking not just "What can the technology do?" but "How do we train more people? How long can they sustain this work? What else do the people doing this work need? How do we build interdisciplinary teams that have the data skills, technical skills, community insight, and subject matter expertise the problem demands?"

Funders or impact partners shouldn't be afraid if any of this sounds overly ambitious or daunting: it's just a different mindset, and a different set of knowledge to acquire. We can all do this together, but to do it, we must change how we build, fund, train, support, and lead the sector moving forward. We must move from being solutions-focused to being problem-focused, from launch-focused to iteration-focused, and from tech-focused to capacity-focused. These challenges require all of us, innovator, funder, and implementer alike, to contribute. They're complex challenges, but tackling them is exactly what data.org was set up to do. For practical information and inspirational ideas to help social impact organizations use data science to solve the world's biggest problems, check out data.org's public resource library.

View post:

From "data for good" to "data for impact" - NationSwell

Read More..

Datamatch: The science behind how to find love, companionship this Valentine’s Day – UW Badger Herald

With Valentine's Day right around the corner, it feels as though everyone is frantically looking for love. COVID-19, isolation and awkward interactions are a recipe for disaster when looking for love or even companionship.

Datamatch is a college-oriented, online matchmaking service created by a group of Harvard students in 1994. Datamatch is now offered at over 30 colleges and universities around the country, and the University of Wisconsin is lucky enough to have made the list already.

Datamatch uses artificial intelligence algorithms to pair people in a science-based way. The algorithms are extremely accurate, hence the years of success behind them.


The service runs a Python script on the night before Valentine's Day to generate matches automatically, UW Datamatch president Caelan Kleinhans said. The website is coded using JavaScript and React.

"Everyone's data and everyone's answers get compiled in a specific way to match people with other people in ways that they want," Kleinhans said. "So it matches people with other people that are looking for the same things, which is really cool. And then it also matches you with people who are similar or different to you based on your answers."


The Datamatch survey is sent out every year on Feb. 7, and people have one week to complete it. The questions cover personal preferences and traits, as well as what someone is looking for in a partner or companion.

On Feb. 14, Valentine's Day, the survey closes and the matches are sent out early that morning, just in time to find love for the biggest day of the year. Everyone then receives around 10 matches based on their survey responses and the algorithm. Once these 10 matches are released, users can select any that seem of interest to them, and mutual matches can set up a date.
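Datamatch has not published its matching code, so here is a purely hypothetical sketch of how survey answers could drive a top-10 list: cosine similarity over numeric answers, with a preference mask. Every name and scoring detail below is an assumption, not the real algorithm.

```python
import numpy as np

def top_matches(answers, preferences, k=10):
    """answers: (n_users, n_questions) numeric survey responses.
    preferences: (n_users, n_users) 0/1 mask of who each user is open to.
    Returns indices of each user's k highest-scoring candidates."""
    unit = answers / np.clip(np.linalg.norm(answers, axis=1, keepdims=True),
                             1e-9, None)
    scores = unit @ unit.T                    # cosine similarity, all pairs
    np.fill_diagonal(scores, -np.inf)         # never match users to themselves
    scores[preferences == 0] = -np.inf        # respect stated preferences
    return np.argsort(-scores, axis=1)[:, :k]

# Toy run: 5 users, 4 questions, everyone open to everyone else.
rng = np.random.default_rng(0)
answers = rng.integers(1, 6, size=(5, 4)).astype(float)
prefs = 1 - np.eye(5, dtype=int)
print(top_matches(answers, prefs, k=2))
```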


Once matches are made, participants in Datamatch can go to events that include watching a movie, ice skating and swing dance lessons, according to Mitchell Schroeder, a member of the UW Datamatch chapter.

"Those events are kind of geared toward anyone who's done Datamatch," Schroeder said. "You can come with your friends, you can come with people you matched with and meet on there. You can really come with anyone. We're just trying to get people together on campus and have a good time."

While you and your uber-compatible date may not be fans of math and science, Datamatch just goes to show that science can be applied in many different ways.

Go here to see the original:

Datamatch: The science behind how to find love, companionship this Valentine's Day - UW Badger Herald

Read More..

Data, Decarbonization, and The New Math of Sustainability – Transmission & Distribution World

Utilities can, and must, play a leading role in the clean-energy transition, and in their own sustainable transformation.

Depending on how you look at it, the glass is either half full or half empty. The half full view says that if utilities can rise to the challenge and meet the 2030, 2040 and 2050 sustainability goals they have laid out, worldwide emissions could fall precipitously. The half empty response counters that the industry is constrained by regulation that stifles innovation. Adding to the difficulty, it also faces downward cost pressure, meaning large rate hikes are not an option.

As someone who has dedicated 35+ years to a career in the utility space and seen the best and worst tendencies, and everything in between, in our industry, I may surprise you by saying I'm not in the half-full camp. I think there's even more in the glass. The challenge ahead is likely the biggest we've ever faced as an industry, but it is also a once-in-a-lifetime opportunity to grow, radically improve, and make a lasting positive impact on society, all fueled by the following macro-drivers:

Sustainability: Decarbonization efforts in service of reaching net zero carbon emissions are the clarion call of the industry. Mass adoption of renewables at scale (wind, solar, battery storage, and an emerging set of new energy sources) is in motion. At the same time, collaborating with business and residential customers using distributed energy resources, such as solar and demand response, will only grow in importance. Feeding into all of this is increasing demand for EVs, part of an electrification wave that could drive utility growth for decades.

Safety and Reliability: While utilities are fighting the consequences of climate change (more severe fires, droughts, floods, and temperature extremes), they are doing so from a position of weakness, and with aging infrastructure. A smarter, more resilient grid must be built to literally weather the storm, and it will cost billions.

Equity: It has always been an issue, but because of COVID-19 and its ongoing variants, low- and moderate-income (LMI) residential customers and small and medium-sized businesses (SMBs) have gotten clobbered, and they need, and are demanding, more from their utilities to handle energy expenses and gain access to sustainability programs.

But we can't just spend our way out of this. The limited ability of utilities to raise rates fundamentally changes the math of transformation. With hundreds of billions in needed capital investment on one side and affordable rates on the other, O&M costs have moved to the middle, becoming the fulcrum. Simply put, utilities' numbers for sustainability, safety, reliability, and equity don't, and can't, add up without a massive decrease in O&M expense and optimized capital expenditures.

But how is that going to happen? That is literally the multi-billion-dollar question, one that necessitates a fundamentally new-and-improved approach. That approach is rooted in data. The strategic use of data is now a must-have capability for every utility in search of increased customer engagement, infrastructure optimization, and sweeping cost reduction.

Just look at the Sustainability Transformation Plan laid out by Evergy. The plan accelerates Evergy's transition to cleaner energy (an 80% reduction in CO2 emissions by 2050), increases capital investment in critical utility infrastructure to nearly US $5 billion, anticipates 5-6% rate base growth and 6-8% EPS growth through 2024, and promises increased benefits for customers and communities. And it reduces O&M costs by 25%.

That's thinking BIG.

But there's a reason this big thinking is more than wishful thinking. The prevalence of AMI and other utility and customer data, the maturation of predictive data science, the evolution of AI and machine learning solutions, and the growing understanding of how to put them all together to move the needle quickly and efficiently mean there has never been a better time for utilities to play offense.

Instead of a cadence-based approach to managing the grid, the industry is moving to a data-driven, risk-spend efficiency approach that optimizes capital investments and O&M spend, managing reliability proactively at a more granular level. In addition to cost savings, this data-driven approach also enables a more collaborative regulatory process I call "results-based compliance," which ties spending directly back to reliability measures. It relies on data, not a calendar, to optimize spending on vegetation management, for example.
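As a minimal sketch of what risk-spend efficiency can mean in practice (project names and figures below are illustrative assumptions, not any utility's actual model), candidate work is ranked by risk reduced per dollar and funded greedily until the budget runs out:

```python
projects = [
    # (name, risk-reduction score, cost in $)
    ("vegetation trim, feeder 12",    8.0,  40_000),
    ("recloser upgrade, feeder 3",    5.0,  60_000),
    ("pole replacement, circuit 7",   9.0, 150_000),
    ("sensor retrofit, substation A", 4.0,  25_000),
]
budget = 120_000

# Rank by risk reduced per dollar -- the "efficiency" in risk-spend efficiency.
ranked = sorted(projects, key=lambda p: p[1] / p[2], reverse=True)

spent, chosen = 0, []
for name, risk, cost in ranked:
    if spent + cost <= budget:
        chosen.append(name)
        spent += cost

print(chosen, f"spent ${spent:,}")
```

A calendar-based program would instead fund whatever is due this cycle, regardless of how much risk each dollar retires.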

With data in hand, the time is NOW for utilities to dive in and accelerate their own transformation into more sustainable utilities.

Because utilities matter. They always have (just ask anyone without power or safe drinking water), but they matter now more than ever. In addition to keeping the lights on, the coolers cooling, and the water flowing, utilities are now poised to play a starring role in the very survival of the planet.

The good news is it's doable, and doable now.

Ted Schultz is the chief executive officer of E Source. He has more than 35 years of industry experience and is known for his focus on providing value to customers. Prior to joining E Source, he served as CEO of TROVE Predictive Data Science, senior vice president of utility solutions at Ecova, and vice president of marketing and energy efficiency at Duke Energy. Ted has held several advisory board positions at the national level, including with the Edison Foundation, National Action Plan for Energy Efficiency, EPRI Energy Services, and JD Power Smart Grid Services. He currently serves on the Marketing Executives Conference.

Continue reading here:

Data, Decarbonization, and The New Math of Sustainability - Transmission & Distribution World

Read More..

Why Big Data Analytics, AI/ML will be the Most In-Demand Skills in India in 2022? – Analytics Insight


The need for skilling and upskilling reached a new high amid the pandemic, and in 2022, big data analytics, along with AI/ML, is reported to be among the most in-demand skills in India. With rapid tech adoption across industries and entirely tech-enabled sectors such as IT and BFSI, the role of AI and machine learning will only continue to grow in 2022, with a significant increase in demand for related roles. Industry reports suggest that AI/machine learning investment in India will continue to grow at a CAGR of 33.49 percent through 2023.
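For reference, the arithmetic behind a 33.49 percent CAGR: a base investment B grows to B(1 + 0.3349)^t after t years. The base index below is a placeholder assumption; only the growth rate comes from the article.

```python
base = 100.0              # hypothetical investment index in the base year
cagr = 0.3349
for year in range(1, 4):  # three compounding years, e.g., through 2023
    print(year, round(base * (1 + cagr) ** year, 1))  # 133.5, 178.2, 237.9
```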

To enhance customer engagement, more and more organizations are adopting chatbots, which are forecast to power approximately 45 percent of organizations' customer support services by 2022. The future of work is location-agnostic and hybrid, with increased skilling initiatives being undertaken by both employers and employees. Leading tech-enabled industries such as IT, FinTech, BFSI, and crypto will continue to flourish, with spikes in talent demand. It is also interesting to note that employee flexibility will be critical to retaining talent in the future, and the Great Shuffle is a reinforcement of how huge demand in the jobs market is opening the door for employees to select a career of their choice. The Indian fintech market is expanding rapidly and is estimated to become the third-largest in the world by 2025.

According to the annual trends report, Indian IT has continued to hire through the course of the pandemic and will exhibit similar trends in 2022.

The IT industry is forecast to grow 7 percent in the current year and is likely to see gross employee additions of around 450,000 in the second half of FY22. The top skills organizations are on the lookout for are big data science, cloud computing, artificial intelligence, blockchain, and machine learning. The demand for sales professionals is estimated to increase, especially in industries such as fintech, retail, e-commerce, and social commerce. With a number of employees preferring remote working and staying in their hometowns, more and more organizations are considering setting up smaller offices in tier 2 cities or utilizing co-working spaces to provide employees with better resources and access to technology. This would, in turn, lead to increased hiring demand across tier 2 cities in the coming months.

Hiring for freshers has picked up over the last three months of the year and is estimated to increase considerably in 2022.

The continuing emergence of startups will further contribute to the demand for entry-level professionals across industries. According to researchers, India has about 6.33 crore micro, small and medium enterprises. The number of registered SMBs grew 18.5% YoY to 25.13 lakh units in 2020, from 21.21 lakh units in 2019. As of 2020, registered SMBs were dominated by micro-enterprises at 22.06 lakh units, up from 18.70 lakh in 2019, while small enterprise units went up from 2.41 lakh to 2.95 lakh. Midsized businesses marginally increased from 9,403 units to 10,981 units in the same period. SMBs and MSMEs are currently reported to employ more than 130 million people and contribute about 6.11% of India's manufacturing GDP, 24.63% of the GDP from service activities, and 33.4% of manufacturing output.


See the article here:

Why Big Data Analytics, AI/ML will be the Most In-Demand Skills in India in 2022? - Analytics Insight

Read More..

£23 million to boost skills and diversity in AI jobs – GOV.UK

Funding for conversion courses will help underrepresented groups get tech jobs even if they have no previous experience in the field

Companies encouraged to contribute to funding to boost skills pipeline for future workforce

Up to £23 million in government funding will create more AI and data conversion courses, helping young people from underrepresented groups, including women, black people and people with disabilities, join the UK's world-leading Artificial Intelligence (AI) industry.

Up to two thousand scholarships for master's AI conversion courses, which enable graduates to do further study in the field even if their undergraduate course is not directly related, will create a new generation of experts in data science and AI.

The UK has a long and exceptional history in AI, from codebreaker Alan Turing's early work through to London-based powerhouse DeepMind's pioneering research, which will enable quicker and more advanced drug discovery.

AI underpins the apps which help us navigate around cities, stop online banking fraud and communicate with smart speakers.

The UK is ranked third in the world for private venture capital investment into AI companies (2019 investment into the UK reached almost £2.5 billion) and is home to a third of Europe's total AI companies.

The new scholarships will ensure more people can build successful careers in AI, create and develop new and bigger businesses, and will improve the diversity of this growing and innovative sector.

DCMS Minister for Tech and the Digital Economy Chris Philp said:

The UK is already a world leader in AI. Today we're investing millions to ensure people from all parts of society can access the opportunities and benefits AI is creating in this country.

We are doubling the number of AI scholarships previously available to underrepresented groups to build a diverse and inclusive workforce fit for the future of tech.

Science Minister George Freeman said:

The UK is one of the world's most advanced AI economies, with AI already playing a key role in everything from climate science and medical diagnostics to factory robotics and smart cities.

It is therefore essential that we continue to equip our workforce with the skills they need in this important technology, while also making the industry accessible to talented people from all backgrounds.

That's why today's significant funding award is so important: it will see underrepresented groups trained up to build successful careers in AI, supporting the growth and diversity of AI in the UK.

UK AI Council Chair Tabitha Goldstaub said:

Ensuring there is a diverse talent pool furthering AI developments is vital for the success of the technology and society.

These master's conversion courses welcoming people from non-STEM degrees attract a less homogeneous group, which means the UK AI ecosystem benefits from graduates with different backgrounds, perspectives, and life experiences.

Obum Ekeke OBE, Head of Education Partnerships at DeepMind, said:

The next generation of AI researchers must be representative of the world around us if AI is to benefit the whole of society.

These scholarships will help to build a stronger and more diverse AI community in the UK, and bring a wider range of experiences - as well as valuable multidisciplinary expertise - to the field.

The government is encouraging companies to play their part in creating a future pipeline of AI talent by match-funding the AI scholarships for the conversion courses. Industry support would get more people into the AI and data science job market quicker and strengthen their businesses.

An independent organisation, to be announced later this year, will be responsible for encouraging industry participation and investment into the AI Scholarships scheme.

First round of AI courses a roaring success

The first stage of the AI conversion courses in 2019, delivered by the Office for Students, supported 28 universities in England to set up and provide degree conversion courses in AI and data science.

The £13.5 million AI scholarship programme enabled a diverse group of students to study AI and data science: 76 per cent of scholarship students were women, whereas only 30 per cent of master's students on traditional computing courses are women. Nearly half (45 per cent) of the scholarship students were black, and 24 per cent had disabilities, boosting representation on the AI and data science courses.

The first phase of the AI courses also attracted tech talent outside of London and the South East, with 70 per cent of the total students and 84 per cent of the scholarship students based outside of these areas in England, levelling up AI and data science skills.

The programme is part of the government's National AI Strategy, which commits to investing in and planning for the AI ecosystem over the next ten years, to boost the UK's leadership as a global science and AI superpower.

The new scholarships follow on from the Industrial Funded AI Masters programme supported by the government since 2019 to increase AI skills across the UK with industry investment.

John Blake, Director for Fair Access and Participation at the Office for Students, said:

The postgraduate conversion courses offer a valuable opportunity for students from all backgrounds to contribute fresh perspectives and innovation to data science and artificial intelligence. In turn, these courses provide an important opportunity for organisations of all sizes to address the digital skills gap and support the post-pandemic recovery right across the country.

The enrolment data for the first year of the programme indicates that the courses are contributing to changes in the tech industry towards a more diverse workforce. I look forward to the next phase of the programme and seeing how universities and organisations are collaborating to support access for underrepresented students, and the subsequent successes of students as they continue to develop their learning and experiences in this crucial industry.

ENDS

Notes to editors: An open competition will be held later this year for universities in England to bid for the scholarships. The next round of the AI scholarship scheme will start in April 2023 and will be available until 2025.

At the 2021 Conservative party conference in Manchester the Chancellor confirmed the creation of 2,000 AI scholarships, adding to the existing 1,000 AI and data science conversion course scholarships.

Courses are inclusive by design, providing students with flexible access to study (e.g. evenings, remote options) and course content suitable for non-STEM graduates.

The Department for Digital, Culture, Media and Sport (DCMS) and Office for AI are also looking to make more underrepresented groups eligible for the 2,000 AI scholarships.

A broker organisation will be established by April 2022 to support industry investment into this programme and provide information to industry across AI skills initiatives. If you would like to support the programme please contact AIskills@officeforai.gov.uk.

The Office for Students has developed a series of case studies about the students receiving scholarships, highlighting the benefits this investment has brought.

The National AI Strategy aims to secure broad public trust and support, and through the involvement of the diverse talents and views of society, focuses on three pillars:

Read the original:

23 million to boost skills and diversity in AI jobs - GOV.UK

Read More..

Learn quantum computing: a field guide – IBM Quantum

Quantum theory is a revolutionary advancement in physics and chemistry that emerged in the early twentieth century. It is an elegant mathematical theory able to explain the counterintuitive behavior of subatomic particles, most notably the phenomenon of entanglement. In the late twentieth century it was discovered that quantum theory applies not only to atoms and molecules, but to bits and logic operations in a computer. This realization has brought about a revolution in the science and technology of information processing, making possible kinds of computing and communication hitherto unknown in the Information Age.

Our everyday computers perform calculations and process information using the standard (or classical) model of computation, which dates back to Turing and von Neumann. In this model, all information is reducible to bits, which can take the values of either 0 or 1. Additionally, all processing can be performed via simple logic gates (AND, OR, NOT, XOR, XNOR) acting on one or two bits at a time, or be entirely described by NAND (or NOR). At any point in its computation, a classical computer's state is entirely determined by the states of all its bits, so that a computer with n bits can exist in one of 2^n possible states, ranging from 00...0 to 11...1.
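That 2^n count is easy to see directly. This brief enumeration (ours, not the guide's) lists every state of a 3-bit register:

```python
# Enumerate all 2**n states of a classical n-bit register.
from itertools import product

n = 3
states = ["".join(bits) for bits in product("01", repeat=n)]
print(len(states), states)   # 8 states: '000', '001', ..., '111'
```

The register occupies exactly one of these states at any instant; the list merely grows as 2^n.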

The power of the quantum computer, meanwhile, lies in its much richer repertoire of states. A quantum computer also has bits, but instead of 0 and 1, its quantum bits, or qubits, can represent a 0, 1, or linear combination of both, which is a property known as superposition. This on its own is no special thing, since a computer whose bits can be intermediate between 0 and 1 is just an analog computer, scarcely more powerful than an ordinary digital computer. However, a quantum computer takes advantage of a special kind of superposition that allows for exponentially many logical states at once, all the states from |00...0⟩ to |11...1⟩. This is a powerful feat, and no classical computer can achieve it.
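To make the richer repertoire concrete, here is a minimal NumPy statevector sketch of our own (the guide itself uses IBM Quantum Composer for hands-on work): an n-qubit register is described by 2^n complex amplitudes, one per basis state, rather than n independent values.

```python
import numpy as np

n = 3
# Uniform superposition over all 2**n basis states |000> ... |111>.
psi = np.ones(2**n, dtype=complex) / np.sqrt(2**n)

probs = np.abs(psi) ** 2       # Born rule: probability of each outcome
print(len(psi), probs.sum())   # 8 amplitudes; probabilities sum to 1.0
```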

The vast majority of quantum superpositions, and the ones most useful for quantum computation, are entangled. Entangled states are states of the whole computer that do not correspond to any assignment of digital or analog states to the individual qubits. A quantum computer is therefore significantly more powerful than any one classical computer, whether it be deterministic, probabilistic, or analog.
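What "no assignment of individual qubit states" means can be checked mechanically for two qubits. In this small illustration (ours, not the guide's), a product state's amplitudes, reshaped into a 2x2 matrix, always have rank 1; the Bell state (|00⟩ + |11⟩)/√2 has rank 2, so it is entangled:

```python
import numpy as np

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
# Schmidt-rank test: rank 1 = product state, rank > 1 = entangled.
rank = np.linalg.matrix_rank(bell.reshape(2, 2))
print(rank)   # 2: no pair of single-qubit states reproduces this state
```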

While today's quantum processors are modest in size, their complexity grows continuously. We believe this is the right time to build and engage a community of new quantum learners, spark further interest in those who are curious, and foster a quantum intuition in the greater community. By making quantum concepts more widely understood, even on a general level, we can more deeply explore all the possibilities quantum computing offers, and more rapidly bring its exciting power to a world whose perspective is limited by classical physics.

With this in mind, we created the IBM Quantum Composer to provide a hands-on opportunity to experiment with operations on a real quantum computing processor. This field guide contains a series of topics to accompany your journey as you create your own experiments, run them in simulation, and execute them on real quantum processors available via IBM Cloud.

If quantum physics sounds challenging to you, you are not alone. But if you think the difficulty lies in hard math, think again. Quantum concepts can, for the most part, be described by undergraduate-level linear algebra, so if you have ever taken a linear algebra course, the math will seem familiar.

The true challenge of quantum physics is internalizing ideas that are counterintuitive to our day-to-day experiences in the physical world, which of course are constrained by classical physics. To comprehend the quantum world, you must build a new intuition for a set of simple but very different (and often surprising) laws.

The counterintuitive principles of quantum physics are:

1. A physical system in a definite state can still behave randomly.

2. Two systems that are too far apart to influence each other can nevertheless behave in ways that, though individually random, are somehow strongly correlated. (Both behaviors are simulated in the sketch below.)
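Both laws can be felt in a small classical simulation of measuring a Bell pair in the same basis on both sides; the sampling below is our sketch, not quantum hardware:

```python
import numpy as np

rng = np.random.default_rng(1)
probs = np.array([0.5, 0.0, 0.0, 0.5])      # outcomes |00>, |01>, |10>, |11>
outcomes = rng.choice(4, size=10_000, p=probs)
a, b = outcomes // 2, outcomes % 2          # qubit A's and qubit B's results

print(a.mean())          # ~0.5: each side alone is a fair coin (law 1)
print((a == b).mean())   # 1.0: yet the two always agree (law 2)
```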

Unfortunately, there is no single simple physical principle from which these conclusions follow, and we must guard against attempting to describe quantum concepts in classical terms! The best we can do is to distill quantum mechanics down to a few abstract-sounding mathematical laws, from which all the observed behavior of quantum particles (and qubits in a quantum computer) can be deduced and predicted.

Keep those two counterintuitive ideas in the back of your mind, let go of your beliefs about how the physical world works, and begin exploring the quantum world!

Read the original:
Learn quantum computing: a field guide - IBM Quantum

Read More..

The Inside Of A Black Hole Deciphered With Quantum Computing – Optic Flux

A physicist at the University of Michigan is using quantum computing and machine learning to help comprehend the concept of holographic duality.

Holographic duality is a mathematical concept that links particle theories and interactions with gravity theory. This conjecture implies that the theories of gravity and particles are mathematically analogous: whatever happens in the theory of gravity also occurs in the theory of particles, as well as vice versa.

Both theories describe dimensions, but they differ by one in the number of dimensions they describe. So, for instance, gravity manifests in three dimensions inside the geometry of a black hole, while particle physics lives in two dimensions on its surface, a flat disk.

Our entire universe, according to some scientists, is indeed a holographic projection of particles, which might lead to a consistent quantum explanation of gravity.

In a paper published in the journal PRX Quantum, Enrico Rinaldi and his co-authors look at ways to study holographic duality using quantum computing and deep learning, to determine the lowest energy state of mathematical objects called quantum matrix models.

When they solve matrix models like these, researchers aim to determine the precise arrangement of particles in the system that represents its lowest energy state, termed the ground state. Nothing changes in the system in that state unless you add something to it that perturbs it.

In Rinaldi's work, the researchers define the quantum wave function, a mathematical representation of the quantum state of their matrix model. They then use a special neural network to determine the wave function of the matrix with the least amount of energy, its ground state. The researchers found the ground state of both matrix models they studied, although quantum circuits are constrained by a limited number of qubits.
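The variational idea can be sketched generically. The toy below is our illustration (the paper's matrix models and neural-network ansatz are far richer): parameterize a trial wave function and minimize its energy ⟨ψ|H|ψ⟩ over a stand-in Hamiltonian, which is exactly the "lowest energy state" search described above.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
dim = 8
A = rng.normal(size=(dim, dim))
H = (A + A.T) / 2                          # toy symmetric "Hamiltonian"

def energy(params):
    psi = params / np.linalg.norm(params)  # normalized trial wave function
    return psi @ H @ psi                   # Rayleigh quotient <psi|H|psi>

result = minimize(energy, rng.normal(size=dim))
print(result.fun, np.linalg.eigvalsh(H)[0])  # variational vs exact ground energy
```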

The findings, according to Rinaldi, set a high bar for future research on quantum and machine learning algorithms that could be used to examine quantum gravity using the concept of holographic duality.

The research was published in the journal PRX Quantum.

Go here to see the original:
The Inside Of A Black Hole Deciphered With Quantum Computing - Optic Flux

Read More..

University of Strathclyde Will Lead Two Quantum Programs with a Total Budget of £960,000 ($1.3M USD) – Quantum Computing Report


The first program is an International Network in Space Quantum Technologies, which will include a consortium of 37 members in 13 countries, including four industrial partners, to develop satellite-enabled quantum-secure communication and Earth observation. It will tackle the technical challenges of putting quantum technology into space, including the radiation environment, autonomous and remote operation, and the limited size, weight and power constraints of satellites. This program will be funded by the UK's Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), with a budget of £480,000. The second program is an International Network for Microfabrication of Atomic Quantum Sensors, which will develop the next generation of miniaturized quantum sensors, with potential applications in healthcare, navigation, finance, communication and security. It is also funded by UKRI, with a budget of £480,000. Additional information about these two quantum-related awards is available in a news release on the University of Strathclyde website here.

February 14, 2022


Read the original here:
University of Strathclyde Will Lead Two Quantum Programs with a Total Budget of 960000 ($1.3M USD) - Quantum Computing Report

Read More..

Sollensys Corp Quantum Computing Threat: The Best Offense is an Even Better Defense – Digital Journal

Sollensys Corp. has announced the completion of a preliminary validation test of a blockchain verification algorithm which is resistant to an attack by even the most sophisticated quantum computer

PALM BAY, FL / ACCESSWIRE / February 15, 2022 / Sollensys Corp (OTC PINK:SOLS), one of the first cybersecurity companies using blockchain, has announced the completion of a preliminary validation test of a blockchain verification algorithm that is resistant to attack by even the most sophisticated quantum computer. The validation test, which utilized a form of lattice-based cryptography, was performed using a Sollensys Blockchain Archive Server™ to secure a data cache of the size typically generated on a daily basis by a large banking corporation.
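The release does not disclose the algorithm itself, but the family it names can be illustrated. Below is a textbook learning-with-errors (LWE) toy, the kind of hard lattice problem quantum-resistant schemes build on; parameters are deliberately tiny and insecure, and nothing here is Sollensys's implementation:

```python
import numpy as np

rng = np.random.default_rng(42)
q, n, m = 257, 8, 32                 # modulus, secret length, sample count

s = rng.integers(0, q, n)            # secret key
A = rng.integers(0, q, (m, n))       # public random matrix
e = rng.integers(-1, 2, m)           # small noise (the "errors" in LWE)
b = (A @ s + e) % q                  # public key: noisy inner products

def encrypt(bit):
    rows = rng.integers(0, 2, m)     # random subset of the samples
    u = (rows @ A) % q
    v = (rows @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q              # ~noise for 0, ~q/2 + noise for 1
    return int(q // 4 < d < 3 * q // 4)

u, v = encrypt(1); print(decrypt(u, v))   # 1
u, v = encrypt(0); print(decrypt(u, v))   # 0
```

Recovering the secret from (A, b) means solving a noisy linear system, a lattice problem with no known efficient quantum algorithm, unlike the factoring and discrete-log problems behind much of today's public-key cryptography.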

Read the Sollensys white paper, The Impending Threat of Quantum Computing on Blockchain Security, now.

Although quantum computing is not yet developed to a high technology readiness level, neither are the defenses against the algorithms that quantum computation promises. Recent advances in quantum computer hardware capabilities suggest a rapidly shifting landscape in which quantum computing poses a systemic threat to blockchain integrity and security viability.

"Today the Sollensys double blockchain technology represents the industry-leading approach for data security. Period. However, the rapidly evolving capability posed by quantum computers, especially if exploited by nefarious actors, is an emerging and salient threat for tomorrow," said Don Beavers, CEO of Sollensys Corp. "In order to proactively thwart such vulnerabilities, Sollensys is developing a portfolio of algorithms which are verifiably resistant to an attack by a quantum computer. These new capabilities will seamlessly integrate into our double blockchain protocols as an option for our clients who continue to require the highest levels of ongoing data security. Further, the development of cryptographic algorithms which are quantum resistant has been determined to be a national priority, and Sollensys intends to be a leader in this space."

The new suite of quantum resistant cryptographic algorithms will be offered as an option on other Sollensys data security products.

ABOUT SOLLENSYS CORP

Sollensys Corp is a math, science, technology, and engineering solutions company offering products that ensure its clients' data integrity through collection, storage, and transmission. Our innovative flagship product is the Blockchain Archive Server, a turnkey, off-the-shelf blockchain solution that works with virtually any hardware and software combination currently used in commerce, without the need to replace or eliminate any part of the client's existing data security. The Company recently introduced its second product offering, the Regional Service Center, which offers small businesses the same state-of-the-art technology previously available only to large or very well-funded companies.

The Blockchain Archive Server encrypts, fragments and distributes data across thousands of secure nodes every day, which makes it virtually impossible for hackers to compromise. Using blockchain technology, the Blockchain Archive Server maintains a redundant, secure and immutable backup of data. Redundant backups and the blockchain work together to assure not only the physical security of the database but also the integrity of the information held within.
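As rough intuition for how a blockchain-style backup exposes tampering (a toy of the general concept only; the actual Blockchain Archive Server design is proprietary), fragments can be hash-chained so that altering any one of them changes every later digest:

```python
import hashlib

def chain(fragments):
    digest, links = b"", []
    for frag in fragments:
        digest = hashlib.sha256(digest + frag).digest()  # link to prior state
        links.append(digest)
    return links

data = b"nightly database export ..."
frags = [data[i:i + 8] for i in range(0, len(data), 8)]

original = chain(frags)
frags[1] = b"tampered"            # corrupt one fragment
print(chain(frags) == original)   # False: the chain exposes the change
```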

The Blockchain Archive Server protects client data from ransomware: malicious software that infects your computer and displays messages demanding a fee for your system to work again. Blockchain technology is a leading-edge tool for data security, providing an added layer of protection against data loss due to all types of software specifically designed to disrupt, damage, or gain unauthorized access to a computer system (i.e., malware).

Uniquely, the Blockchain Archive Server is a turnkey solution that can stand alone or seamlessly integrate into an existing data infrastructure to quickly recover from a cyber-attack. It comes pre-loaded with the blockchain-powered cybersecurity software and can be delivered, installed, and integrated into a client's computer systems with ease.

For more information please visit: https://www.sollensys.com.

Forward-Looking Statements: Certain information in this press release contains forward-looking statements. All statements other than statements of historical facts included herein are forward-looking statements. In some cases, forward-looking statements can be identified by words such as "believe," "expect," "anticipate," "plan," "potential," "continue" or similar expressions. Such forward-looking statements include risks and uncertainties, and there are important factors that could cause actual results to differ materially from those expressed or implied by such forward-looking statements. These factors, risks and uncertainties are discussed in the Company's filings with the Securities and Exchange Commission (the "SEC"). You should carefully consider these factors, risks and uncertainties described in, and other information contained in, the reports we file with or furnish to the SEC before making any investment decision with respect to our securities. Readers should not place any undue reliance on forward-looking statements since they involve known and unknown uncertainties and other factors which are, in some cases, beyond the Company's control and which could, and likely will, materially affect actual results, levels of activity, performance or achievements. Any forward-looking statement reflects the Company's current views with respect to future events and is subject to these and other risks, uncertainties and assumptions relating to operations, results of operations, growth strategy and liquidity. The Company is under no obligation (and expressly disclaims any such obligation) to publicly update or revise these forward-looking statements for any reason, or to update the reasons actual results could differ materially from those anticipated in these forward-looking statements, even if new information becomes available in the future.

Investor Relations:

Sollensys Corp
866.438.7657
www.sollensys.com
[emailprotected]
https://business.facebook.com/Sollensys/
https://www.linkedin.com/company/sollensys-corp/

SOURCE: Sollensys Corp

View source version on accesswire.com: https://www.accesswire.com/688818/Sollensys-Corp-Quantum-Computing-Threat-The-Best-Offense-is-an-Even-Better-Defense

Read more:
Sollensys Corp Quantum Computing Threat: The Best Offense is an Even Better Defense - Digital Journal

Read More..

Quantum and the art of noise – ComputerWeekly.com

Noise, huh, what's it good for? Absolutely nothin'. Apart from the geniuses trying to further the advancement of noisy intermediate-scale quantum (NISQ) computing, noise means errors. Lowering the error rate in this emerging area of computing requires significantly more physical qubits for every useful logical qubit.
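The physical-to-logical overhead can be sized with the commonly cited surface-code scaling, where a distance-d code uses on the order of 2d^2 physical qubits and suppresses logical errors roughly like (p/p_th)^((d+1)/2). The figures below are illustrative assumptions, not vendor numbers:

```python
# Find the code distance d needed to push a physical error rate p down to a
# target logical error rate, then estimate physical qubits per logical qubit.
p, p_th, target = 1e-3, 1e-2, 1e-12   # physical rate, threshold, goal

d = 3
while (p / p_th) ** ((d + 1) // 2) > target:
    d += 2                            # surface-code distances are odd

print(d, 2 * d ** 2)                  # e.g., d=23 -> ~1,058 physical qubits
```

Hence the familiar rule of thumb that each useful logical qubit can cost on the order of a thousand physical ones at today's error rates.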

Computer Weekly recently spoke to a number of experts in the field of quantum computing, and a picture is emerging that illustrates the effort going into making something practical out of a technology few truly understand. It promises so much. Imagine being able to solve problems in a way that is simply impossible with existing high-performance computing. By being able to simulate chemistry at the quantum level, a quantum computer opens up huge opportunities in materials science, and a way to control chemical reactions in industrial processes to achieve outcomes such as reducing harmful emissions and waste or improving yield.

One of the new companies trying to make the most of existing tech is Algorithmiq. Its co-founder and CEO, Sabrina Maniscalco, believes that full fault tolerance in quantum computing will require technical advances in manufacturing and may even require fundamental principles to be discovered because, as she says: "The science doesn't exist yet." Her company has just received funding to help it develop algorithms for the pharmaceutical sector that can cope with today's noisy quantum computers.

Many of the labs running quantum computing systems need to operate at close to absolute zero (-273 degrees Celsius) to form superconducting qubits. But this level of cooling is not particularly scalable, so one of the ongoing areas of research is how to achieve quantum computing at room temperature. This is the realm of the trapped-ion quantum computer, which requires an entirely different approach. Winfried Hensinger, chief scientist at Universal Quantum, a spin-out from Sussex University, believes that trapped-ion quantum computers are more resilient to noise. He says: "The ion is naturally much better isolated from the environment as it just levitates above a chip."

Another startup, Quantum Motion, spun out of UCL, is looking at how to industrialise quantum computing by measuring the quantum state of a single electron in a silicon transistor. Significantly, this transistor can be manufactured using the same chip fabrication techniques used to make microprocessors.

These three examples represent a snapshot of the ingenuity being poured into quantum computing research. A universal quantum computer may be years off, but something usable and scalable is almost within earshot.

Continued here:
Quantum and the art of noise - ComputerWeekly.com

Read More..