Data Mining – Definition, Applications, and Techniques

Data mining is the process of uncovering patterns and finding anomalies and relationships in large datasets that can be used to make predictions about future trends. The main purpose of data mining is to extract valuable information from available data.

Data mining is considered an interdisciplinary field that joins the techniques of computer science and statistics. Note that the term data mining is a misnomer: it is primarily concerned with discovering patterns and anomalies within datasets, not with the extraction of the data itself.

Data mining offers many applications in business. For example, the establishment of proper data mining processes can help a company decrease its costs, increase revenues, or derive insights from the behavior and practices of its customers. Certainly, it plays a vital role in the business decision-making process nowadays.

Data mining is also actively utilized in finance. For instance, relevant techniques allow users to determine and assess the factors that influence the price fluctuations of financial securities.

The field is rapidly evolving. New data emerges at enormously fast speeds while technological advancements allow for more efficient ways to solve existing problems. In addition, developments in the areas of artificial intelligence and machine learning provide new paths to precision and efficiency in the field.

Generally, the process can be divided into the following steps: defining the business problem, collecting and preparing the data, building and evaluating models, and interpreting and deploying the results.

The most commonly used techniques in the field include classification, clustering, regression, association rule learning, and anomaly detection. A minimal sketch of one of these techniques follows.
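As an illustration only (the article itself names no tools or datasets), the sketch below applies one common technique, clustering, to a small synthetic dataset using Python and scikit-learn, and flags points far from their cluster centers as possible anomalies. All data, parameters, and thresholds here are assumptions.

```python
# Minimal clustering sketch with synthetic data; values are illustrative only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=0)
# Three artificial "customer" groups described by two features,
# e.g. purchase frequency and average order value.
data = np.vstack([
    rng.normal(loc=[2, 20], scale=1.0, size=(50, 2)),
    rng.normal(loc=[8, 60], scale=1.5, size=(50, 2)),
    rng.normal(loc=[15, 30], scale=2.0, size=(50, 2)),
])

model = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = model.fit_predict(data)

# Points unusually far from their cluster center can be treated as anomalies.
distances = np.linalg.norm(data - model.cluster_centers_[labels], axis=1)
threshold = distances.mean() + 3 * distances.std()
print("cluster sizes:", np.bincount(labels))
print("possible anomalies at rows:", np.where(distances > threshold)[0])
```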

CFI offers the Business Intelligence & Data Analyst (BIDA) certification program for those looking to take their careers to the next level. To keep learning and advancing your career, the following CFI resources will be helpful:

View original post here:

Data Mining - Definition, Applications, and Techniques

DATA MINING the TWITTER MINEFIELD – Contact tracing RAMPING UP again – UNPACKING the BALLOT INITIATIVES – Politico

DATA MINING THE TWITTER MINEFIELD – Politics is more than what you see on Twitter. But a new analysis of Boston electeds' online activity shows there's plenty of information to be gleaned from politicians' feeds and follower lists, especially when it comes to the city's mayoral and council elections.

Legislata, a productivity software for politicians, ran the numbers to rank the top 8,225 accounts followed by city councilors and members of Boston's state delegation.

Rep. Ayanna Pressley, a former city councilor, tops the list. Suffolk District Attorney Rachael Rollins, now the nominee for U.S. attorney for Massachusetts, is next. She's followed by a tie for third between City Councilor Annissa Essaibi George, former state Sen. Linda Forry, the state Democratic Party, and state Attorney General Maura Healey.

Mayoral hopeful Essaibi George is followed by a higher percentage of Boston electeds than any of her rivals for the city's top job, and even former mayor Marty Walsh. But City Councilors Michelle Wu and Andrea Campbell, and Acting Mayor Kim Janey's campaign account, all have more followers to help broadcast their messages to wider audiences – a helpful tool as they fight to secure one of the top two spots in the September preliminary.

There are also clear differences in the candidates' messaging. Wu tweets most about climate and community and emphasizes words like "bold" and "change." Campbell frequently uses words like "equitable" and "accountability" and often mentions the police. Janey, the first woman and Black person to serve as the city's top executive, focuses on "proud" and "joy" and "history" in her messaging. Essaibi George leans into hashtags like #getbosbacktobiz and #citykidswin, while former city economic development chief John Barros uses "affordable" most frequently.

Twitter followers don't necessarily equal support. But if the percentage of Boston electeds following certain candidates did translate to turnout, the four at-large council seats up for grabs would likely be won by incumbents Michael Flaherty and Julia Mejia, and newcomers Ruthzee Louijeune and David Halbert, the Legislata analysis shows. That projection actually aligns with a recent analysis of at-large candidates' cash flows and citywide networks from Rivera Consulting, which also favored those candidates for the at-large seats.

"All politics is local, especially on Boston political Twitter," Legislata founder and CEO Chris Oates said. More city councilors follow the Dorchester Reporter than the Boston Globe, and some of the most followed reporters and columnists are the most hyperlocal, like the Dorchester Reporter's Bill Forry, Adam Gaffin's Universal Hub and Chris Lovett of Boston Neighborhood Network News. See the full list here.

GOOD MORNING, MASSACHUSETTS. Have a tip, story, suggestion, birthday, anniversary, new job, or any other nugget for the Playbook? Get in touch: [emailprotected]

TODAY – Rep. Lori Trahan, House Speaker Pro Tempore Kate Hogan and local officials highlight federal investments to support the Assabet River Rail Trail extension at 12:30 p.m. in Stow. Janey hosts a press conference to discuss the latest Covid-19 updates and the launch of a mental health response pilot, 1 p.m. at Boston City Hall. Wu works a shift at Bon Me's food truck at 1:30 p.m. in Dewey Square to raise awareness for supporting small businesses. Campbell is a guest on "The Last Word with Lawrence O'Donnell" on MSNBC.

Massachusetts coronavirus cases spike 962, hospitalizations keep rising, by Rick Sobey, Boston Herald: Massachusetts health officials on Wednesday reported a spike of 962 coronavirus cases, as daily infection counts keep climbing amid the more highly contagious delta variant. The 962 cases is the highest single-day case count in nearly three months.

A message from The Massachusetts Coalition for Independent Work:

83% of app-based drivers in Massachusetts want to remain independent while accessing expanded benefits. The Massachusetts Coalition for Independent Work is dedicated to securing flexibility, providing new benefits and guaranteeing an earnings floor for app-based drivers. Learn more, and join our advocacy for independent workers.

Baker shields tax credits for medical devices, shipping companies, by Christian Wade, CNHI/Gloucester Daily Times: Gov. Charlie Baker is moving to protect tax credits for companies that make medical devices and for importers and exporters who use the state's shipping ports. Baker said the tax breaks encourage innovation and economic activity and should be maintained, in a message notifying the Legislature of his veto of an addendum to the state budget that would have ended three tax credits.

State Rep. Williams: We're Going To Push For $1 Billion For Black And Brown Communities, by Edgar B. Herwick III, GBH News: State Representative Bud L. Williams is calling on the state to invest $1 billion of federal funds in Black and Brown communities. This came Wednesday at a virtual roundtable discussion, as state lawmakers are currently working to determine how to distribute some $5.3 billion in federal COVID-19 recovery funds from the American Rescue Plan Act (ARPA).

Baker: All Staff In Long-Term Care Facilities Must Be Vaccinated By Oct. 10, by Meghan B. Kelly, WBUR: The Baker administration announced Wednesday that all staff in long-term care facilities, including skilled nursing facilities and the state's two Soldiers' Homes, must be fully vaccinated against COVID-19 by Oct. 10. The mandate marked Gov. Charlie Baker's first order to require vaccination in any private or public workforce.

Contact tracing ramping up yet again, by Bruce Mohl, CommonWealth Magazine: The state's contact tracing effort is ramping up again in the midst of a resurgence in COVID-19 cases. The program was scheduled to shut down in September but instead the contract with the operator, the nonprofit Cambridge-based Partners in Health, is being extended through the end of the year. The number of contact tracers, currently at 130, is also being increased by as much as 300.

Vaccine the latest shot featured at nightclubs, by Jill Harmacinski, Eagle-Tribune: LAWRENCE – COVID-19 vaccines have been available at pharmacies, clinics, pop-up sites and even block parties. Now, people can grab a drink and a shot at some local nightclubs. Vaccine availability is being increased across the city with more block parties in parks. And, starting Aug. 13, you can also get a vaccine in a nightclub, Mayor Kendrys Vasquez announced.

Provincetown to certify businesses based on COVID-19 vaccination policies, by Drew Karedes, Boston 25 News: There are three certificates: 1. Venue requires all staff to be vaccinated 2. Venue requires proof of vaccination to enter 3. All venue staff are vaccinated, and proof of vaccine is required to enter.

Boston Superintendent Brenda Cassellius is no longer licensed to run school system, by Andrea Estes and James Vaznis, Boston Globe: Boston Schools Superintendent Brenda Cassellius' license to run a school system in Massachusetts has expired because she never took the state's certification exams, a turn of events that puts her in violation of her contract. ... The deadline for taking the test and securing a new license was last Saturday. Cassellius apologized to the School Committee Wednesday night after the Globe published a story about her license lapsing, saying it was due to a misunderstanding over licensing deadlines. She told them she is scheduled to take the tests on Aug. 14.

Plans for Boston Methadone Mile hotel housing scuttled in face of community opposition, by Sean Philip Cotter, Boston Herald: The controversial plans to house homeless in a vacant Methadone Mile hotel are dead, the main proponent told locals. Victory Programs had sought to move 14 to 35 people living on the streets in the rough area known as Mass and Cass or Methadone Mile into the hotel at 891 Massachusetts Ave. ... The idea, backed by a state grant, was to use the hotel as transitional housing for people on the streets, stabilizing them before moving them out of the Mile, which continues to worsen as a haven of violence and open-air drug use.

Boston: Only One Mayoral Candidate Says Moving Money From Police To Social Services Is A Top Priority, by Saraya Wintersmith, GBH News: A little more than a year has passed since protesters defied pandemic quarantine orders and spilled into Boston streets, decrying racism and police brutality and calling for defunding the Boston Police Department. Now, with about a month to go until the city's Sept. 14 preliminary election, most mayoral candidates have narrowed their focus to improving the Boston Police Department's response to mental health crises cases. Only one, city councilor Andrea Campbell, is charging full-steam-ahead with a plan to restructure the department and reduce its budget by $50 million, about 12.5% of this year's allocation.

Boston Mayoral Candidates Address City's Racial Wealth Gap, by Anthony Brooks, WBUR: All the major Boston mayoral candidates say they have ambitious plans to take on one of the most stubborn problems facing the city: The glaring racial wealth gap. It also matters that the current field of candidates is historically diverse, which means that when some of them talk about how to close the wealth gap, it's personal.

Child care is now a major political issue. Here's how the Boston mayoral candidates want to reform it, by Stephanie Ebbert, Boston Globe: Four of the five major contenders have presented detailed campaign plans on the issue and all have endorsed the recent recommendations of the Birth to Eight Collaborative, a coalition of parents, nonprofits, schools, and advocates working to ensure all children are prepared to succeed when they enter school.

Campbell escalates COVID-19 clash with Janey, by Danny McDonald and Jasper Goodman, Boston Globe: "Councilor Andrea Campbell on Wednesday afternoon ramped up criticism of Acting Mayor Kim Janey over her handling of the pandemic. And a day after invoking slavery and birtherism in response to a question about New York City's new proof-of-vaccine requirements, Janey stood fast, defending her approach to COVID-19 as sensible and equitable, and emphasizing the importance of the public getting vaccinated."

At-large hopefuls weigh-in on 'vaccine proof' idea, by Katie Trojano, Dorchester Reporter: Acting Mayor Kim Janey has come under fire from mayoral rivals after appearing to balk at New York's move and comparing requiring the proof to slavery and birtherism. We asked the at-large city council hopefuls to share their stances.

Annissa Essaibi George: The Boston.com interview, by Nik DeCosta-Klipa, Christopher Gavin, and Zipporah Osei, Boston.com: In the crowded field of self-identifying progressive Democrats running to be Boston's next mayor, Annissa Essaibi George is often defined in contrast to her opponents. ... The daughter of immigrants and a lifelong Dorchester native, Essaibi George emphasizes her focus on the most pressing struggles faced by the city's most vulnerable, like homelessness and mental health issues that she saw up close as a former East Boston High School teacher. "I'd like to be the teacher candidate," Essaibi George says, calling her classroom experience an invaluable part of her bona fides as a mayoral candidate.

Some FIRST IN PLAYBOOK endorsements: EMILY's List is endorsing Framingham Mayor Yvonne Spicer and Newton Mayor Ruthanne Fuller in their reelection bids, and is also endorsing Vilma Martinez-Dominguez in the Lawrence mayor's race. "These strong women leaders have what it takes to confront the challenges in their communities. We know they will continue to work tirelessly on COVID-19 recovery, affordable housing initiatives, infrastructure modernization, and investments in local schools," EMILY's List said in a statement.

EMILY's List is also endorsing four women seeking reelection to the Boston City Council: District 1 Councilor Lydia Edwards, District 8 Councilor Kenzie Bok, District 9 Councilor Liz Breadon and at-large Councilor Julia Mejia.

Boston City Council at-large candidate Ruthzee Louijeune has been endorsed by United Auto Workers Region 9A, per her campaign. "We are inspired by Ruthzee's campaign of progressive values and inclusive style of leadership," UAW Region 9A regional director Beverley Brakeman said in a statement.

District 3 City Councilor Frank Baker has endorsed Bridget Nee-Walsh for Boston city councilor at-large, per her campaign. "Working families can trust her to fight for good jobs with good pay and benefits, strong public schools, affordable housing, and public safety," Baker said in a statement.

IBEW Local 2222 and Mass Voters for Animals have endorsed Mary Tamer for Boston City Council District 6, per her campaign.

What's At Stake In Somerville's Mayoral Race? by Adam Reilly, GBH News: When it comes to who, exactly, calls Somerville home, the city has been changing for years, with Asian, Latino and Black residents comprising an increasingly large portion of the city's population of 80,000. Still, Somerville's mayors have always been white – a streak Will Mbah hopes to break this fall. Current mayor Joseph Curtatone, who took office in 2004, decided not to seek a tenth term this November.

Twenty-eight proposed laws and two constitutional amendments have now been filed with the state attorney general's office, the first step in a lengthy and costly process to advance ballot initiatives that touch on everything from legalizing the sale of consumer fireworks, to voter identification laws, to classifying gig-economy workers as independent contractors.

The Boston Globe's Emma Platoff details the Massachusetts GOP's effort to test opposition to critical race theory. The state GOP is also backing measures to "preserve the lives of children born alive" and to require voters to present identification at the polls, a topic that drew multiple petitions.

The Boston Herald's Erin Tiernan reports on a ballot question that would attempt to roll back the contentious Transportation Climate Initiative. That petition was filed by Republicans including GOP gubernatorial candidate Geoff Diehl, and Democratic state Rep. Colleen Garry. It's also backed by the conservative-leaning Massachusetts Fiscal Alliance.

Another initiative petition aims to bring back some long-banned happy hour drink specials – a topic of recently renewed debate.

Most of the proposed questions are unlikely to make it to the 2022 ballot, reports CommonWealth Magazine, noting that even if they pass muster on constitutional grounds with Attorney General Maura Healey's office, they would still require the gathering of more than 93,000 voter signatures, a time-consuming and expensive process. The AG's office plans to publish a list of certified petitions on Sept. 1.

Speaking of the 2022 ballot: Watchdogs blast Massachusetts millionaire tax proposal as state awash in excess tax revenues, by Erin Tiernan, Boston Herald: "Fiscal watchdogs blasted the Massachusetts millionaire tax proposal as the state is awash in more than $5 billion in excess tax revenues and billions more in federal coronavirus relief."

About the 2020 ballots: Geoff Diehl calls for forensic audit of possible irregularities in Massachusetts 2020 election, by Erin Tiernan, Boston Herald: Republican candidate for governor Geoff Diehl has challenged Gov. Charlie Baker to conduct a forensic audit to investigate possible irregularities during last year's election. Diehl also pledged to combat the extension of mail-in voting parameters.

Sens. Elizabeth Warren, Ed Markey and Martin Heinrich (D-N.M.) are seeking a national memorial day to commemorate the more than 614,000 people who have died from Covid-19 in the United States and those still suffering from the virus. The senators introduced a resolution yesterday proposing that the first Monday in March be designated as Covid-19 Victims and Survivors Memorial Day. More from the Washington Post's William Wan.

Trahan touts 'forever chemicals' cleanup bill, by Christian M. Wade, CNHI/Eagle-Tribune: Communities would receive money to help clean up the forever chemicals contaminating their drinking water and sewage treatment systems under a plan working its way through Congress.

Mexico sues U.S.-based gunmakers over flow of arms across border, by Mary Beth Sheridan and Kevin Sieff, Washington Post: The Mexican government sued several major U.S.-based gun manufacturers Wednesday, alleging that lax controls contribute to the illegal flow of weapons over the border. The unusual suit filed in U.S. federal court in Boston seeks unspecified financial compensation from the companies but does not target the U.S. government.

A message from The Massachusetts Coalition for Independent Work:

The Massachusetts Coalition for Independent Work is dedicated to securing flexibility in scheduling, providing new benefits – including healthcare stipends, paid sick time, paid family & medical leave and occupational accident insurance – and guaranteeing an earnings floor for all app-based drivers in Massachusetts.

We're banding together with drivers, community partners and elected officials to protect the flexibility and independence that drivers value, while expanding their benefits.

Learn more, and join our advocacy for independent workers.

Chelsea, Revere and Winthrop Investigate How Climate Change Impacts Most Vulnerable Populations, by Phillip Martin and Hannah Reale, GBH News: Chelsea, Revere and Winthrop are launching a cooperative project to understand how climate change will specifically affect low income residents, people of color and other vulnerable residents. Ultimately, the aim is to find gaps in the region's approach to combating climate change, centered first and foremost around the communities likely to be most affected by it, and then form recommendations about how to take them on.

Safety Steps Required of Donors To Attend Baker Fundraiser, by Colin A. Young, State House News Service (paywall): Anyone attending the outdoor fundraiser on Sept. 2 for [Gov. Charlie] Baker and Lt. Gov. Karyn Polito at public relations maven George Regan's home at the Willowbend Country Club in Mashpee is asked to be vaccinated against COVID-19 or to get tested for the virus 48 hours ahead of the bash.

Maybe he heard Baker: Obama Significantly Scales Back 60th Birthday Party as Virus Cases Rebound, by Annie Karni, New York Times: "The party plans had been months in the making and many invitees had already arrived on Martha's Vineyard when former President Barack Obama belatedly announced he was canceling his huge 60th birthday bash scheduled for Saturday."

Former Gov. Deval Patrick On Cuomo Allegations: 'There Does Have To Be A Reckoning,' by Greater Boston staff: [Former Gov. Deval] Patrick did not call outright for Gov. Cuomo to resign, but said the allegations should be taken seriously. "These are very very serious charges from a very credible source," he said. "There does have to be a reckoning. If my opinion counts for anything, he'll take these allegations seriously and not dismiss them out of hand."

TRANSITIONS – Robyn Kenney joins the Diehl campaign as communications director. Janey appointed Dr. Alison Brizius as commissioner for Boston's Environment Department.

HAPPY BIRTHDAY to former ambassador, Biden deputy campaign manager, MA-03 candidate and current chief of protocol nominee Rufus Gifford; Christina Pacheco, and Jim Puzzanghera of the Boston Globe's D.C. bureau.

Want to make an impact? POLITICO Massachusetts has a variety of solutions available for partners looking to reach and activate the most influential people in the Bay State. Have a petition you want signed? A cause you're promoting? Seeking to increase brand awareness among this key audience? Share your message with our influential readers to foster engagement and drive action. Contact Jesse Shapiro to find out how: [emailprotected].

Originally posted here:

DATA MINING the TWITTER MINEFIELD Contact tracing RAMPING UP again UNPACKING the BALLOT INITIATIVES - Politico

Mining its own data, CityBldr builds tool to show cities the best places to build affordable housing – GeekWire

Blue patches represent publicly owned land in Seattle that could support housing. (CityBldr Graphic)

Engineers at CityBldr knew they were sitting on a goldmine of zoning and land data. After all, that's how the Bellevue, Wash.-based big data company helps large corporations know the most cost-effective way to expand operations.

Then five years ago, staff members realized the data might lend itself toward solving one of the biggest social problems in the urban United States: affordable housing. The same information that could show a company where to build its next warehouse could also show a housing nonprofit or a city planner the entire inventory of underutilized, publicly owned land.

Moreover, it could immediately show them how many people could be housed on each parcel under existing zoning. Today, the company is launching a free demonstration website called Public to show in a sharply limited-access way what the software can do.

And with the launch, CityBldr is kicking off a campaign to get corporations to sponsor affordable housing nonprofits in order to get full access to the complete data that could reveal anything from low-hanging housing fruit to the underpinning of a long-term housing plan.

"We spent five years building the Rosetta Stone of zoning," said Bryan Copley, CEO and co-founder of CityBldr. "And we think it can really help change the amount of housing available."

That data could come in handy in Seattle come November. Should the Compassion Seattle Initiative get voter approval in the next election, the city will be required to build 2,000 units of housing over the following two years.

Copley said CityBldr has compiled a vast and deep land database of 100 U.S. cities with 255 different zoning standards. In those cities, Public data can show everything from land valuation, parcel size, current zoning, what currently is on the land, and how many people could be housed on the land under existing regulations.

For land that has multi-use zoning, a user can click through parameters for single-family, townhome, or multi-family dwellings to find out how many people or units the land could legally hold. A city planner could find out in minutes how many additional people could be housed on all available public land within the city limits.
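As a rough, hypothetical sketch of the kind of capacity estimate described above – none of the zoning figures, parcel sizes, or household sizes below come from CityBldr's data – a calculation like this might look as follows in Python.

```python
# Toy capacity estimate per zoning type; all figures are placeholder assumptions.
ZONING_RULES = {
    "single-family": {"min_lot_sqft_per_unit": 5000},
    "townhome":      {"min_lot_sqft_per_unit": 2500},
    "multi-family":  {"min_lot_sqft_per_unit": 800},
}
AVG_PERSONS_PER_UNIT = 2.3  # assumed household size

def capacity(parcel_sqft: float, zoning: str) -> dict:
    """Estimate how many units and people a parcel could legally hold."""
    rule = ZONING_RULES[zoning]
    units = int(parcel_sqft // rule["min_lot_sqft_per_unit"])
    return {"zoning": zoning, "units": units,
            "people": round(units * AVG_PERSONS_PER_UNIT)}

# A user "clicking through" the allowed uses of a 40,000 sq ft parcel:
for use in ZONING_RULES:
    print(capacity(40_000, use))
```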

After consulting with urban planning experts at U.C. Berkeley, M.I.T. and Harvard University – the Harvard expert researched how best to help cities with the data – Copley said the search was restricted to publicly owned land for two reasons: it can be easier to get a city to unload underused land to a nonprofit, and cities sometimes don't have a simple way to track their own land inventories.

That said, the database someday could be opened to privately owned land as well, he said.

"We built it so people could make use of it," Copley said. "You can't make a private individual sell. Some people just want to sit on the land. But publicly owned land can be different."

Copley said CityBldr representatives have spoken to housing officials and government leaders in Seattle and across the country, and the reception has been enthusiastic. He said the cost of getting the data for each city will run $10,000, so that is what a corporation will pay to sponsor a nonprofit.

Ideally, he said, CityBldr won't make a dime on Public. The plan is to collect money to pay for five full-time staff to help housing nonprofits and cities wade through and understand the data while continually updating the database as local regulations and land inventories change.

The dream, Copley said, is to make this zero out, cost-wise. "We're not doing this for the money."

Read the rest here:

Mining its own data, CityBldr builds tool to show cities the best places to build affordable housing - GeekWire

Zero-carbon bitcoin? The owner of a Pennsylvania nuclear plant thinks it could strike gold – The Philadelphia Inquirer

Could bitcoin mining be the salvation of the embattled nuclear energy industry in America?

The owners of several nuclear power plants, including two in Pennsylvania, have formed ventures with cryptocurrency companies to provide the electricity needed to run computer centers that mine bitcoin. Since nuclear energy does not emit greenhouse gases, the projects' investors say, the zero-carbon bitcoin would address climate concerns that have tarnished the energy-intensive cryptocurrency industry.

Talen Energy, the owner of the Susquehanna Steam Electric Station near Berwick, Pa., announced this week that it has signed a deal with TeraWulf Inc., an Easton, Md. cryptocurrency mining firm, to build a giant bitcoin factory next to its twin reactors in northern Pennsylvania. The first phase of the venture, dubbed Nautilus Cryptomine, could cost up to $400 million.

Talen's project could eventually use up to 300 megawatts – or 12% of Susquehanna's 2,500 MW capacity. It's the second bitcoin-mining venture in the last month that involves owners of Pennsylvania nuclear facilities.

Last month Energy Harbor Corp., the former power-generation subsidiary of First Energy Corp., announced it signed a five-year agreement to provide zero-carbon electricity to a new bitcoin mining center operated by Standard Power in Coshocton, Ohio. Energy Harbor owns two nuclear units in Ohio and the twin-unit Beaver Valley Power Station in Western Pennsylvania.

A nuclear fission start-up, Oklo, also announced last month it signed a 20-year deal with a bitcoin miner to supply it with power, though the company has not yet built a power plant.

In recent years, commercial nuclear operators have struggled to compete in competitive electricity markets against natural gas plants and upstart renewable sources such as wind and solar. Unfavorable market conditions have hastened the retirements of several single-unit reactors, such as Three Mile Island Unit 1 in Pennsylvania. Lawmakers in New Jersey, New York and Illinois have enacted nuclear bailouts, paid by electricity customers, to stave off early retirement for other plants.

The cryptocurrency deals would provide nuclear generators with reliable outlets for their power, and bitcoin miners with predictable sources of power at cheap prices, along with a zero-carbon cachet.

"Nuclear energy is uniquely positioned to provide power to crypto mining companies and other major energy users who have committed to a carbon-free future," John Kotek, senior vice president of policy development and government affairs at the Nuclear Energy Institute, said in an email.

The nuclear industry views the crypto craze not as a crutch but as a launching pad for expansion. "U.S. nuclear power plants are ready and able to supply miners with abundant, reliable carbon-free power while also providing new business pathways for the nuclear developers and utilities, increasing their operating profits, and potentially accelerating the deployment of the next generation of reactors," Kotek said.

Nuclear producers aren't the only power generators getting in on the trend. Stronghold Digital Mining, a bitcoin miner that registered last month for a $100 million initial stock offering, plans to build its bitcoin mining operation in northwestern Pennsylvania, powered from Venango County waste coal. While its bitcoin would not be zero-carbon, it would reduce environmentally harmful piles of waste coal.

Energy and cryptocurrency experts say several trends are shifting the market in favor of U.S. nuclear power producers.

In May, Chinese regulators announced new measures to limit bitcoin mining in several regions that failed to meet Beijing's energy-use targets. Bitcoin production levels have fallen since then, forcing bitcoin producers to relocate to places with low operating costs and cool climates to reduce the costs of cooling the bitcoin data centers. The state of Washington, which has lots of inexpensive hydroelectric power, has undergone a huge boom in bitcoin mining.

Bitcoin is a peer-to-peer virtual currency that operates without a central authority and can be exchanged for traditional currency such as the U.S. dollar. It is the most successful of hundreds of attempts to create virtual money through the use of cryptography, the science of making and breaking codes – hence the name cryptocurrency.

Bitcoin mining is built around blockchain technology, and it involves generating a string of code that decrypts a collection of previously executed bitcoin transactions. Successful decryption is rewarded with a new bitcoin. The supply of bitcoins is limited to 21 million – nearly 90% have already been mined. So the remaining bitcoins become increasingly scarce and more difficult to extract.

Data centers operated by bitcoin miners randomly generate code strings, called hashes, to solve the puzzle and earn new coins. Worldwide, miners on the bitcoin network generate more than 100 quintillion hashes per second – that's 100,000,000,000,000,000,000 guesses per second, according to Blockchain.com. The first phase of the Nautilus project in Pennsylvania would generate five quintillion hashes per second.
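To make the guesswork concrete, here is a toy Python sketch of the idea, not Bitcoin's actual mining code: it hashes a block of data with successive nonces until the result falls below a target. The difficulty setting and block data are arbitrary assumptions chosen so the loop finishes quickly; real mining uses double SHA-256 over block headers and a network-wide difficulty.

```python
# Toy proof-of-work sketch; parameters are illustrative, not Bitcoin's.
import hashlib

def mine(block_data: str, difficulty_bits: int = 16):
    """Guess nonces until the SHA-256 hash falls below the target."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:       # a "winning" guess
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block of transactions")
print(f"found a valid hash after about {nonce + 1:,} guesses: {digest[:16]}...")
```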

Such guesswork requires muscular computing power, robust internet connections, and lots of electricity. Smaller bitcoin miners have teamed up in consortiums to pool their computing power. Bigger players have built huge data centers devoted exclusively to producing lines of random code.

"Mining cryptocurrency is an international, profitable, and energy-intensive business," ScottMadden, a management consulting firm, said in a paper it published last year. Bitcoin mining consumes an estimated 0.5% of the electricity produced worldwide – or about as much as the country of Greece.

Some lawmakers have called for greater regulation of cryptocurrency, citing the enormous amount of resources required to produce it. "There are computers all over the world right now spitting out random numbers around the clock, in a competition to try to solve a useless puzzle and win the bitcoin reward," Sen. Elizabeth Warren (D., Mass.) said in June, calling for a crackdown on "environmentally wasteful cryptocurrencies."

But as a business proposition, bitcoin has appeal. ScottMadden, the consulting firm, suggested last year that nuclear operators in some states were in a unique position to profit from cryptocurrency ventures.

Diverting 1 megawatt of power to an efficient mining operation could conservatively generate top-line revenue of $900,000 a year and profits of $650,000, not accounting for cooling, repairs, or technicians, according to ScottMadden. Its analysis predicts that a project could break even in about 15 months.

The consulting firm's conceptual project was based upon a bitcoin price of $9,275. The price of a bitcoin last week varied between $38,000 and $42,000.
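For a sense of why those numbers got attention, the back-of-envelope sketch below naively rescales ScottMadden's per-megawatt figures to the article's recent price range. It holds costs and mining difficulty constant, which would not hold in practice, so it is an illustration of the arithmetic rather than a forecast.

```python
# Naive rescaling of ScottMadden's per-MW figures; illustration only.
BASE_PRICE = 9_275       # bitcoin price assumed in the ScottMadden analysis
BASE_REVENUE = 900_000   # top-line revenue per MW per year at that price
BASE_PROFIT = 650_000    # profit per MW per year at that price
BASE_COST = BASE_REVENUE - BASE_PROFIT

for price in (38_000, 42_000):
    revenue = BASE_REVENUE * price / BASE_PRICE
    profit = revenue - BASE_COST  # assumes costs stay flat, which they would not
    print(f"at ${price:,}/BTC: ~${revenue:,.0f} revenue, ~${profit:,.0f} profit per MW")
```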

Such numbers no doubt got the attention of Talen Energy, which plans to divert about 180 MW to the first phase of the Nautilus Cryptomine, which would be producing bitcoin at the Susquehanna plant in Luzerne County.

"I think it's a great opportunity for our plant," said Dustin Wertheimer, vice president and divisional chief financial officer of Talen Energy. He is based in Allentown, home to Talen's previous owner, PPL Corp. Talen is now based in The Woodlands, Texas.

Unlike other crypto projects in which the power generator is an arms-length electricity supplier, the Nautilus Cryptomine is a 50-50 venture between Talen and TeraWulf. The project would be directly connected to the Susquehanna plant – "behind the meter," in industry parlance – and would avoid any transmission costs from the grid.

The direct connection also guarantees that the operation is sourced exclusively with carbon-free energy, Wertheimer said.

"You've seen some of the press and the negative publicity that bitcoin has received recently and the impact of fossil fuel," Wertheimer said. "So that's a great thing for us to have a direct connection into a carbon-free power source."

The cryptomine would be located inside a 200,000-square-foot building – about four football fields. The mining operation would be built on a data center campus that Talen is developing next to the Susquehanna plant. The data center would generate about 1,000 construction jobs, Wertheimer said. The cryptomine would employ about 50 people to operate.

The first phase of the project would cost about $350 million to $400 million. The Nautilus venture is negotiating with fiber-optic providers to bring in super-charged internet connections required to transmit and receive the huge amounts of code it generates, Wertheimer said.

"As you look across the United States, and you look at kind of the challenges that are facing nuclear plants, I think this is a great opportunity to prolong the life of a lot of plants," he said.

The Future of Work is produced with support from the William Penn Foundation and the Lenfest Institute for Journalism. Editorial content is created independently of the projects donors.

Original post:

Zero-carbon bitcoin? The owner of a Pennsylvania nuclear plant thinks it could strike gold - The Philadelphia Inquirer

Data Mining Tools Market Will Create Highest Returns by 2027 along with COVID-19 Analysis – The Shotcaller

The whole situation that determines product demand is covered in this Data Mining Tools market report, including constraints, drivers, recent events, restraints, technological innovations, and opportunities for companies. For newcomers to the industry, the present conditions, industry analysis, and program effectiveness depicted in this Data Mining Tools market report are extremely useful. This Data Mining Tools market report provides an exhaustive current assessment, along with upcoming projections and the market setting, to give a comprehensive overview of market evolution. Profitability, industry turnover, and progress are also highlighted in this research. This Data Mining Tools market report also covers projects across North America, Latin America, Europe, Asia Pacific, the Middle East, and Africa, among other regions.

Get Sample Copy of Data Mining Tools Market Report at: https://www.globalmarketmonitor.com/request.php?type=1&rid=637727

This Data Mining Tools market report sets out company objectives to help industry participants avoid faulty assumptions. It provides customer data as well as customer needs, allowing key industry leaders to plan product releases that benefit from economic growth. It contains all of the information concerning the overall market position, along with key evidence and precise data about the market, and it helps organizations achieve their strategies by supplying market growth data. This Data Mining Tools market report also covers the dealings and exchanges that occur in the market. Consequently, buyers, sellers, suppliers, and customers can use the report to study the market thoroughly, including the selling and purchasing of particular products in the market.

Key global participants in the Data Mining Tools market include: Oracle, SAP SE, SAS Institute, IBM Corporation, Microsoft Corporation, Intel

Inquire for a discount on this Data Mining Tools market report at: https://www.globalmarketmonitor.com/request.php?type=3&rid=637727

Market Segments by Application: BFSI, Healthcare and Life Sciences, Telecom and IT, Government and Defense, Energy and Utilities, Manufacturing, Others

On the basis of products, the various types include: On-premises, Cloud

Table of Contents
1 Report Overview
1.1 Product Definition and Scope
1.2 PEST (Political, Economic, Social and Technological) Analysis of Data Mining Tools Market
2 Market Trends and Competitive Landscape
3 Segmentation of Data Mining Tools Market by Types
4 Segmentation of Data Mining Tools Market by End-Users
5 Market Analysis by Major Regions
6 Product Commodity of Data Mining Tools Market in Major Countries
7 North America Data Mining Tools Landscape Analysis
8 Europe Data Mining Tools Landscape Analysis
9 Asia Pacific Data Mining Tools Landscape Analysis
10 Latin America, Middle East & Africa Data Mining Tools Landscape Analysis
11 Major Players Profile

New advances are also introduced in this Data Mining Tools market report to give readers a competitive edge. Various industry parameters are also studied through statistical analysis. Moreover, the report compares different geographical markets, focusing on major regions of the worldwide market such as North America, Europe, Asia Pacific, Latin America, and the Middle East & Africa, and it explains the market trends for the particular product. It also describes the effects of the COVID-19 health crisis on various industries. The smallest details about the market are given to support sound demand forecasts. Knowing customers is the best way to give them what they need, and this Data Mining Tools market report provides accurate information about customers. The main focus of this market research is to forecast market growth during the years 2021-2027.

Data Mining Tools Market Intended Audience:
- Data Mining Tools manufacturers
- Data Mining Tools traders, distributors, and suppliers
- Data Mining Tools industry associations
- Product managers, Data Mining Tools industry administrators, C-level executives of the industries
- Market research and consulting firms

The data is emphasized at the national level to show how sales, volume, and earnings differ by location. It illustrates the probable shortfalls and challenges that several major businesses are facing. This study also involves a full analysis of price movements from 2021 to 2027, along with a compounded calculation of the program's financial budget and profit, as well as important players. With the aid of this comprehensive study, one can readily acquire knowledge about the significance of COVID-19 for industry expansion.

See the original post:

Data Mining Tools Market Will Create Highest Returns by 2027 along with COVID-19 Analysis – The Shotcaller

What Is Process Mining (and Why Isn’t It Enough)? – IDM.net.au

Businesses live by their processes – the prescribed sets of actions their employees take to get things done. When processes run well, the business runs well. When processes run poorly, the business risks a host of hazards, from loss of revenue to customer dissatisfaction to compliance violations. Most businesses have a general idea of how their processes should run, but lack insight into the day-to-day details of execution. Without this insight, how can they make improvements that yield real results?

Process mining offers one solution, and for many years it served businesses well. However, in today's increasingly complex environment and amid growing pressure to do more – faster and at lower costs – organizations need more intelligent solutions.

In this article, we'll explore what process mining is, what it can (and can't) do for businesses looking to optimize their processes, and how Process Intelligence offers a more effective approach.

What is process mining?

Process mining uses actual data from information systems to create a model that accurately reflects how a process executes.

Applications such as CRM and ERP systems, as well as other systems of record, automatically create event logs that record every action taken. The data in these logs can be collected, or mined, to create an audit trail of the processes the applications are involved in. This works even when multiple applications are used in a single process. Process mining technology follows these audit trails to build a process model showing the details of the end-to-end process, as well as variations. Business users can analyze these models to find out if the processes are functioning as they should and, if not, investigate the root causes of deviations from the optimal path.
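As a minimal sketch of the core idea – grouping event-log records into per-case traces and counting the distinct paths taken – the Python below uses a tiny invented log; it is not any vendor's process mining engine.

```python
# Minimal process-discovery sketch over an invented event log.
from collections import Counter, defaultdict
from operator import itemgetter

event_log = [  # (case_id, activity, timestamp)
    ("A1", "Create order", 1), ("A1", "Approve", 2), ("A1", "Ship", 3),
    ("A2", "Create order", 1), ("A2", "Reject", 2),
    ("A3", "Create order", 1), ("A3", "Approve", 2), ("A3", "Ship", 3),
]

# Group events into one ordered trace per case (the "audit trail").
traces = defaultdict(list)
for case_id, activity, ts in sorted(event_log, key=itemgetter(0, 2)):
    traces[case_id].append(activity)

# Count process variants: how often each end-to-end path occurs.
variants = Counter(tuple(trace) for trace in traces.values())
for path, count in variants.most_common():
    print(count, " -> ".join(path))
```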

How process mining works

Before process mining, the only way for businesses to analyze the performance of their processes was through interviews with business users and manual data reviews – a slow, tedious undertaking with a high margin of error. Process mining allows organizations to leverage automation to paint accurate pictures of real-world process performance – faster, easier, and more accurately than manual approaches.

Where process mining falls short

Process mining offers enormous advantages over manual approaches to process analysis, but it has its limitations, several of which are discussed in the sections below.

How Process Intelligence bridges the gap

A new generation of process analytics solutions goes beyond traditional process mining. Process Intelligence combines BI-like metrics with a set of process-specific analytics to offer detailed insights into complex processes from end to end. Unlike traditional process mining, Process Intelligence enables businesses to view their processes in real time and analyze patterns that lead to bottlenecks or disruptions.

Here are the top four advantages Process Intelligence offers over traditional process mining:

1. Timeline-based analysis

Process mining uses the schema method of process analysis, which involves converting process data into a flowchart (schema) and then analyzing the flow of all iterations through that schema. The shortcoming of this approach is that few business processes fit into a well-organized flowchart. By the time all valid variations to a process are considered, the schema often becomes a tangled mess with limited usefulness.

By contrast, Process Intelligence uses a timeline approach, which creates an unfiltered, unedited history of every process iteration from beginning to end. These timelines are then analyzed so that they can be compared, filtered, searched, aggregated, and so on, similarly to how a BI application analyzes records in a table. The timeline approach offers complete visibility into the process from end to end, even when some steps are performed using multiple systems. And the numerical analysis approach of Process Intelligence works equally well on all types of processes, whereas basic process mining only works well on processes with little variability in the sequence of steps. A rough sketch of this timeline-style analysis follows.
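The sketch below is only a rough illustration of the timeline idea under assumed data: each case is flattened into one record with its step count and duration, and those records are then filtered and aggregated like rows in a table.

```python
# Timeline-style analysis over invented case timelines; thresholds are assumptions.
from statistics import mean

timelines = {  # case_id -> ordered (activity, timestamp) events
    "A1": [("Create order", 0), ("Approve", 5), ("Ship", 9)],
    "A2": [("Create order", 0), ("Approve", 30), ("Ship", 41)],
    "A3": [("Create order", 0), ("Reject", 2)],
}

# One record per case, like a row in a BI table.
records = [
    {"case": case, "steps": len(events), "duration": events[-1][1] - events[0][1]}
    for case, events in timelines.items()
]

print("average duration:", mean(r["duration"] for r in records))          # aggregate
print("slow cases:", [r["case"] for r in records if r["duration"] > 10])  # filter
```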

2. Continuous improvement

Traditional process mining is focused on looking at historical data. While this approach can offer valuable insights into what worked well and what didn't, it falls short of offering solutions for present and future iterations.

Process Intelligence monitors processes with new data coming in real and near-real time, watching every iteration and alerting process owners to deviations that could cause delays or other problems. By enabling continuous improvement, Process Intelligence continues to deliver ROI as businesses capitalize on new opportunities to make processes work faster and smarter.

3. Reduces compliance risks

When businesses run traditional process mining applications, users can review the output to identify present and past deviations that could lead to compliance issues. This approach relies on the expertise of the users reviewing the data.

Process Intelligence enables users to define process rules that align with the organization's compliance requirements and to instruct the system to watch for violations. When one or more of those rules is broken, the system alerts users right away, enabling them to take immediate action to rectify the deviation and to ensure that it will not happen again. Process Intelligence alerting rules can also be defined to call a service when an alert is triggered, to automatically deal with the problem. This capability can mean the difference between discovering an issue just in time, before it affects a business's compliance status, and finding it when it is too late to be fixed and has already caused problems elsewhere in the workflow – or worse, learning about it after a violation has been reported. A hedged sketch of such a rule check appears below.
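The following sketch illustrates the rule-and-alert pattern under invented data: a simple compliance rule is checked for each trace, and a print statement stands in for calling an external service. The rule, traces, and callback are assumptions, not part of any specific product.

```python
# Illustrative rule check with an alert callback; all names and data are invented.
def four_eyes_rule(trace):
    """Example rule: the approver must not be the same person as the requester."""
    actors = {activity: user for activity, user in trace}
    return actors.get("Request") != actors.get("Approve")

def alert(case_id, reason):
    # A real system might call a web service here instead of printing.
    print(f"ALERT for case {case_id}: {reason}")

live_traces = {
    "C1": [("Request", "ana"), ("Approve", "bob")],
    "C2": [("Request", "cara"), ("Approve", "cara")],  # rule violation
}

for case_id, trace in live_traces.items():
    if not four_eyes_rule(trace):
        alert(case_id, "approver matches requester (four-eyes rule broken)")
```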

4. RPA enhancements

According to Ernst & Young, between 30 and 50 percent of initial robotic process automation (RPA) projects fail due to lack of quantifiable process data. As businesses deploy RPA for more intricate processes in more complex environments, the pressure to deliver positive ROI has increased dramatically, and traditional process mining can offer only limited support in yielding the returns that businesses are seeking.

Fortunately, Process Intelligence can be just as valuable to digital workers (RPA "bots") as it is to human employees. Today's Process Intelligence solutions can include process mining and task mining. As with process mining, task mining is looking for the significant events in a process. Task mining adds the ability to record a user's manual actions on their computer to capture manual process steps to be used alongside the steps gleaned from system-of-record log files. By applying Process Intelligence to manual as well as automated processes, businesses uncover new opportunities to improve RPA results.

Why Process Intelligence is the future of process improvement

For many years, process mining applications served process owners well, saving countless manual hours and helping businesses discover opportunities for improvement. Process Intelligence provides a new approach to process improvement that improves on process mining. Process Intelligence works with all processes, simple and highly variable, manual and automated. Process Intelligence will monitor every process instance as each new step occurs, alerting or even taking automated action whenever a process behavior of interest is seen.

Process Intelligence supports RPA initiatives by identifying good automation candidates and then monitoring and reporting on the process that bots participate in. Process improvement can now reach a new level in delivering on its promises of greater productivity, reduced risk of costly compliance violations, and the streamlined efficiency that can create happier customers, happier employees, and a greater competitive edge.

See the original post:

What Is Process Mining (and Why Isn't It Enough)? - IDM.net.au

Data Mining Software Market is Projected to Increase at a Considerable Rate with COVID-19 Impact – The Shotcaller

This Data Mining Software market report provides a wide scope of information to help industry players establish their presence in this worldwide market. It also captures the impact of the economic environment on prospects in key growth segments. This market study report describes relevant market information, including new platforms, advancements, and tools introduced in the market. The report can be used as an ideal tool by players to gain a practical edge over competitors and to help ensure lasting success for their ventures. In addition, reliable sources are used here to validate and revalidate the information mentioned, and analysts have performed industry-specific and original research to provide exhaustive information about market development.

Get Sample Copy of Data Mining Software Market Report at: https://www.globalmarketmonitor.com/request.php?type=1&rid=642852

Furthermore, the results and information in this Data Mining Software market report were acquired from reputable sources. The market report's data can help you anticipate future revenue and make financial decisions. Market research and extensive market studies are undertaken to provide up-to-date facts on the company situation and industry trends. By offering specifics in the form of compelling data visualization, this market research extends beyond the market's basic structure. This research study provides a detailed picture of prospective growth strategies, restraints, key competitors, the preceding period, and market size by region and area for the forecasting period 2021-2027.

Major enterprises in the global market of Data Mining Software include: IBM, RapidMiner, Salford Systems, Apteco, Oracle, Lexalytics, GMDH, SAS Institute, University of Ljubljana

Ask for the Best Discount at: https://www.globalmarketmonitor.com/request.php?type=3&rid=642852

On the basis of application, the Data Mining Software market is segmented into: Large Enterprises, Small and Medium-sized Enterprises (SMEs)

Type Synopsis: Cloud-based, On-premises

Table of Contents
1 Report Overview
1.1 Product Definition and Scope
1.2 PEST (Political, Economic, Social and Technological) Analysis of Data Mining Software Market
2 Market Trends and Competitive Landscape
3 Segmentation of Data Mining Software Market by Types
4 Segmentation of Data Mining Software Market by End-Users
5 Market Analysis by Major Regions
6 Product Commodity of Data Mining Software Market in Major Countries
7 North America Data Mining Software Landscape Analysis
8 Europe Data Mining Software Landscape Analysis
9 Asia Pacific Data Mining Software Landscape Analysis
10 Latin America, Middle East & Africa Data Mining Software Landscape Analysis
11 Major Players Profile

This Data Mining Software market report divides the market into several categories, including segmentation variables and geographic segmentation covering North America, Europe, the Middle East, Latin America, and Asia Pacific. It also offers insight into the worldwide economic downturn brought on by the COVID-19 epidemic. The pandemic had a significant impact on both the demand and supply sides, causing significant interruptions in company growth. This Data Mining Software market study will focus on the dangers that businesses face and how to cope with them. It assists you in understanding your company's market position. By giving a prediction for the years 2021-2027, it assists major players in introducing new goods to market. By giving all relevant data on market competitiveness, company methods, key value propositions, and the overall market landscape, the market research study allows business operators to stay innovative and up to date. The solid measures offered here will considerably assist new entrants in expanding their market operations.

In-depth Data Mining Software Market Report: Intended Audience
- Data Mining Software manufacturers
- Downstream vendors and end-users
- Traders, distributors, and resellers of Data Mining Software
- Data Mining Software industry associations and research organizations
- Product managers, Data Mining Software industry administrators, C-level executives of the industries
- Market research and consulting firms

The negative consequences of the COVID-19 epidemic on nearly all business sectors are also depicted in this Data Mining Software market report. It gives industry participants vital information regarding business constraints, such as sales strategies, new industry breakthroughs, and pricing structure. This market study also explains the market scope, development outlook, drivers, component analysis, and overall revenue, along with the industry trends, tactics, and processes implemented by market contributors. This Data Mining Software market report is a useful guide for new companies, since it gives customers insight into market conditions, historic sales, new product developments, and pricing strategies. It goes on to classify the marketplace and provide analysis by product category, region, and organization size, and it explains the market size, future trends, growth factors, segment evaluation, and emerging markets for the forecast period of 2021-2027. A few key elements are also highlighted here to assist businesses in achieving significant business advantages. Aside from that, the Data Mining Software market report includes important data gathered from trustworthy sources and presents a thorough industry analysis in order to explain company strategies and assist business participants in strengthening their position in the global market.

Global Market Monitor
One Pierrepont Plaza, 300 Cadman Plaza W, Brooklyn, NY 11201, USA
Name: Rebecca Hall
Phone: +1 (347) 467 7721
Email: info@globalmarketmonitor.com
Web Site: https://www.globalmarketmonitor.com

More:

Data Mining Software Market is Projected to Increase at a Considerable Rate with COVID-19 Impact – The Shotcaller

The global graph database market size to grow from USD 1.9 billion in 2021 to USD 5.1 billion by 2026, at a Compound Annual Growth Rate (CAGR) of…

during the forecast period. Various factors, such as the need to incorporate real-time big data mining with visualization of results, increasing adoption of AI-based graph database tools and services, and growing demand for solutions that can process low-latency queries, are expected to drive the adoption of graph database solutions and services.

New York, Aug. 06, 2021 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Graph Database Market with COVID-19 Impact Analysis, By Type, Application, Component, Deployment Mode, Vertical And Region - Global Forecast to 2026" - https://www.reportlinker.com/p05436929/?utm_source=GNW

COVID-19's global impact has shown that interconnectedness plays an important role in international cooperation. As a result, several governments rushed to identify, evaluate, and procure reliable solutions powered by AI. Graph databases and AI are invaluable to organizations managing uncertainty in real time, whereas most predictive models rely only on historical patterns. The use of graph databases and AI accelerated during the COVID-19 pandemic, helping organizations engage customers through digital channels, manage fragile and complex supply chains, and support workers through disruption to their work and lives. New practices, such as working from home and social distancing, created a need for graph database solutions and services and for digital infrastructures that support large-scale technology deployments. The pandemic in 2020 also brought accelerating changes in consumer preferences and behaviors, putting pressure on brands to keep pace and provide a personalized customer experience. Enterprises have seen a reduction in their operational spending and are now focusing more on business continuity and sustainability.

Technology and service providers have been facing significant disruption to their businesses from COVID-19. The pandemic has disrupted the global financial markets and has created panic, uncertainty, and distraction in the operations of global corporations.

The cloud segment to have the largest CAGR during the forecast period
By deployment mode, the graph database market has been segmented into on-premises and cloud. The CAGR of the cloud deployment mode is estimated to be the largest during the forecast period.

Cloud-based deployment helps businesses process and report data findings more efficiently, enhances collaboration, and gives decision-makers faster access to business intelligence, which has led to its higher adoption in the graph database market.

The SMEs segment to hold higher CAGR during the forecast period
The graph database market has been segmented by organization size into large enterprises and SMEs. The market for SMEs is expected to register a higher CAGR during the forecast period, as cloud-based solutions and services help them improve business performance and enhance productivity.

In contrast, the large enterprises segment is expected to hold a larger market share in the graph database market during the forecast period, owing to large enterprises' ability to afford and adopt emerging technologies.

Among regions, APAC to hold higher CAGR during the forecast period
APAC is expected to grow at a good pace during the forecast period. Opportunities for smaller graph database vendors to introduce graph database solutions for numerous sectors have also increased.

All these factors are responsible for the expeditious growth of the graph database market in the region. Companies operating in APAC continue to focus on improving customer services to drive market competitiveness and revenue growth.

China, Japan, and ANZ have displayed ample growth opportunities in the graph database market.

Breakdown of primaries
In-depth interviews were conducted with Chief Executive Officers (CEOs), innovation and technology directors, system integrators, and executives from various key organizations operating in the graph database market.
By Company: Tier I: 35%, Tier II: 45%, and Tier III: 20%
By Designation: C-Level Executives: 35%, D-Level Executives: 25%, and Managers: 40%
By Region: APAC: 30%, Europe: 30%, North America: 25%, MEA: 10%, Latin America: 5%
The report includes the study of key players offering graph database solutions and services. It profiles major vendors in the global graph database market.

The major vendors in the global graph database market include Oracle Corporation (US), IBM Corporation (US), Amazon Web Services, Inc. (US), DataStax (US), Ontotext (Bulgaria), Stardog Union (US), Hewlett Packard Enterprise (US), ArangoDB (US), Blazegraph (US), Microsoft Corporation (US), SAP SE (Germany), Teradata Corporation (US), Openlink Software (US), MarkLogic Corporation (US), TIBCO Software, Inc. (US), Neo4j, Inc. (US), GraphBase (Australia), Cambridge Semantics (US), TigerGraph, Inc. (US), Objectivity Inc. (US), Bitnine Co, Ltd. (US), Franz Inc. (US), Redis Labs (US), Graph Story (US), Dgraph Labs (US), Eccenca (Germany), and Fluree (US).

Research Coverage
The market study covers the graph database market across segments. It aims at estimating the market size and the growth potential of this market across different segments, such as components, type, deployment mode, organization size, application, vertical, and region.

It includes an in-depth competitive analysis of the key players in the market, along with their company profiles, key observations related to product and business offerings, recent developments, and key market strategies.

Key Benefits of Buying the Report
The report would provide the market leaders and new entrants in this market with information on the closest approximations of the revenue numbers for the overall graph database market and its subsegments. It would help stakeholders understand the competitive landscape, gain more insights to better position their business, and plan suitable go-to-market strategies.

It also helps stakeholders understand the pulse of the market and provides them with information on key market drivers, restraints, challenges, and opportunities.
Read the full report: https://www.reportlinker.com/p05436929/?utm_source=GNW

About Reportlinker
ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.


Original post:

The global graph database market size to grow from USD 1.9 billion in 2021 to USD 5.1 billion by 2026, at a Compound Annual Growth Rate (CAGR) of...

Read More..

Latest News AI in Media- Top 8 Thrilling AI-based Movies on Netflix – Analytics Insight

The world is progressing at high speed, especially after the pandemic. With rapid evolutions in technology, there is no doubt that the future of media and technology will be shaped by Artificial Intelligence. Artificial Intelligence, Machine Learning, and Data Science have become common terms in Hollywood films. Hollywood movies tend to overemphasize dramatic effects, as in the Terminator films, where killer robots from SKYNET turn sentient and start destroying humans. Still, we can find some truth in these movies. AI is pictured as a miracle invention, the intelligent machine that can perform critical tasks such as simulations and predictions much better than humans. In the movies, we also see AI as a helper to humans, assisting in all kinds of tasks, from being a companion on space adventures to a lover. Netflix has an amazing collection of AI-based movies that are thrilling and mesmerizing.

Analytics Insight presents you with top Netflix Hollywood movies based on AI.

The film was released in 2015. It portrays an oppressive future where the police force is mechanized, with autonomous robots that monitor the streets and handle lawbreakers. One of these robots is stolen and upgraded from ANI to AGI, which means it becomes sentient. This robot, named Chappie, develops emotions of its own and is intended to fight the corruption in society. The film is thrilling and very futuristic.

IMDB- 6.8

The film was released in 2014. It offers a very different view of AI: it features a pair of quadrilateral robots named TARS and CASE, which take on a form that has no resemblance to the popular human-like designs seen in Star Wars and Star Trek. The design of TARS and CASE puts function above humanness, and it hints at many possibilities for critical design in robotics and AI.

IMDB- 8.6

The film was released in 2013. It is about a man who falls in love with an artificially intelligent OS named Samantha. There are virtual smart assistants in our smartphones today, but when you watch the movie, you will understand that Samantha is far more sophisticated than our current assistants. Unlike Siri and Google Assistant, which can only do simple tasks such as texting a friend or setting an alarm clock, Samantha has an amazing command of language, emotion, and common sense, and can handle demanding tasks like downloading millions of books and filtering emails in seconds. The film gives a glimpse of what voice assistants may be like in the future, and of how we might even fall in love with them.

IMDB- 8.0

The film, released in 2011, shows the stunning power of predictive analytics and its potential in the real world. In the film, data analytics and the Moneyball theory are used to assemble the best possible team of undervalued players on a minimal budget; a winning baseball team is built through data mining on players. The film portrays the significance of data in decision making and of choosing the right statistics for predictive modeling.

IMDB- 7.6
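
As a loose illustration of the kind of analysis the film dramatizes, the hypothetical Python sketch below ranks players by how much of a key statistic (on-base percentage) they deliver per dollar of salary; the player names and numbers are invented, not taken from the film or from real rosters.

# A minimal sketch of the Moneyball idea: rank players by how much on-base
# percentage (OBP) they deliver per dollar of salary. All names and figures
# below are hypothetical examples.

players = [
    {"name": "Player A", "obp": 0.360, "salary": 2_500_000},
    {"name": "Player B", "obp": 0.345, "salary": 900_000},
    {"name": "Player C", "obp": 0.390, "salary": 7_000_000},
    {"name": "Player D", "obp": 0.330, "salary": 600_000},
]

# Value metric: on-base percentage per million dollars of salary.
for p in players:
    p["obp_per_million"] = p["obp"] / (p["salary"] / 1_000_000)

undervalued = sorted(players, key=lambda p: p["obp_per_million"], reverse=True)
for p in undervalued:
    print(f'{p["name"]}: OBP {p["obp"]:.3f}, '
          f'{p["obp_per_million"]:.3f} OBP per $1M')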

The film was released in 2002 and portrays the importance of data science. In this film, a team of humans called PreCogs, portrayed much like data scientists, can foresee future crimes by exploring huge amounts of data. Their visions are transmitted to PreCrime, a police unit that is sent to prevent the crime. This brings up the idea of how data can be used to do great things in the real world, like preventing disasters and saving millions of lives.

IMDB- 7.6

Released in 2014, the film features Alan Turing, the creator of the Turing Test and a father of modern computing. It shows how Turing, a legendary mathematician, breaks Enigma, the cipher the Nazis used to encode their secret messages. Turing sets out to build a machine able to work through possible combinations far faster than any human being could. With this machine, Turing paved the way for modern computing and helped shape the fields of cryptography and cybersecurity.

IMDB- 8.0

The film was released in 1994. The AI star of the movie is Lieutenant Commander Data, a sentient, self-aware android who serves as a senior officer aboard the USS Enterprise. An emotion chip is installed in his AI-powered brain, giving him the ability to experience human emotions. This makes Data a human-like android who can feel as humans do, but with an advanced brain that can calculate risks better than any human.

IMDB- 6.6

The film was released in 1982. It is a dystopian science-fiction film in which bioengineered, AI-powered replicas of humans live among real humans; the only difference is that they live for just four years. The film is followed by a sequel, Blade Runner 2049, which is also a great film.

IMDB- 8.1


Read this article:

Latest News AI in Media- Top 8 Thrilling AI-based Movies on Netflix - Analytics Insight

Read More..

Google says it has created a time crystal in a quantum computer, and it’s weirder than you can imagine – ZDNet


In a new research paper, Google scientists claim to have used a quantum processor for a useful scientific application: to observe a genuine time crystal.

If 'time crystal' sounds pretty sci-fi, that's because time crystals are. They are no less than a new "phase of matter", as researchers put it, theorized for some years now as a state that could potentially join the ranks of solids, liquids, gases, crystals and so on. The paper remains a pre-print and still requires peer review.

Time crystals are also hard to find. But Google's scientists now rather excitingly say that their results establish a "scalable approach" to study time crystals on current quantum processors.


Understanding why time crystals are interesting requires a little background in physics, particularly the second law of thermodynamics, which states that systems naturally tend to settle in a state known as "maximum entropy".

To take an example: if you pour some milk into a cup of coffee, the milk will eventually disperse throughout the coffee instead of sitting on the top, enabling the overall system to come to an equilibrium. This is because there are many more ways for the milk to spread randomly throughout the coffee than there are for it to sit, in a more orderly fashion, at the top of the cup.

This irresistible drive towards thermal equilibrium, as described in the second law of thermodynamics, is reflective of the fact that all things tend to move towards less useful, random states. As time goes on, systems inevitably degenerate into chaos and disorder; that is, entropy.

Time crystals, on the other hand, fail to settle in thermal equilibrium. Instead of slowly degenerating towards randomness, they get stuck in two high-energy configurations that they switch between, and this back-and-forth process can go on forever.

To explain this better, Curt von Keyserlingk, lecturer at the school of physics and astronomy at the University of Birmingham, who did not participate in Google's latest experiment, pulls out some slides from an introductory talk to prospective undergraduate students. "They usually pretend to understand, so it might be useful," von Keyserlingk warns ZDNet.

It starts with a thought experiment: take a box in a closed system that is isolated from the rest of the universe, load it with a couple of dozen coins, and shake it a million times. As the coins flip, tumble and bounce off each other, they randomly change positions and the arrangement becomes increasingly chaotic. Upon opening the box, the expectation is that you will be faced with roughly half of the coins on heads and half on tails.

It doesn't matter if the experiment started with more coins on their tails or more coins on their heads: the system forgets what the initial configuration was, and it becomes increasingly random and chaotic as it is shaken.
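
To make the thought experiment concrete, here is a minimal, hypothetical Python sketch (not part of von Keyserlingk's talk): it starts the box from an all-heads configuration, applies random flips, and shows the count drifting toward a half-and-half split, so the system "forgets" its starting point. The coin and shake counts are arbitrary choices.

import random

# Toy simulation of the coins-in-a-box thought experiment described above:
# start from an ordered configuration (all heads), randomly flip coins many
# times, and watch the system settle near a 50/50 head-tail split.

def shake(coins: list, shakes: int) -> list:
    """Each 'shake' flips one randomly chosen coin (1 = heads, 0 = tails)."""
    for _ in range(shakes):
        i = random.randrange(len(coins))
        coins[i] ^= 1
    return coins

if __name__ == "__main__":
    random.seed(0)
    coins = [1] * 24            # a couple dozen coins, all heads to start
    for step in (0, 10, 100, 1_000, 10_000):
        snapshot = shake(list(coins), step)
        heads = sum(snapshot)
        print(f"after {step:>6} shakes: {heads}/{len(snapshot)} heads")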

This closed system, when it is translated into the quantum domain, is the perfect setting to try and find time crystals, and the only one known to date. "The only stable time crystals that we've envisioned in closed systems are quantum mechanical," says von Keyserlingk.

Enter Google's quantum processor, Sycamore, which is well known for having achieved quantum supremacy and is now looking for some kind of useful application for quantum computing.

A quantum processor, by definition, is a perfect tool to replicate a quantum mechanical system. In this scenario, Google's team represented the coins in the box with qubits spinning upwards and downwards in a closed system; and instead of shaking the box, they applied a set of specific quantum operations that can change the state of the qubits, which they repeated many times.

This is where time crystals defy all expectations. Looking at the system after a certain number of operations, or shakes, reveals a configuration of qubits that is not random, but instead looks rather similar to the original set up.

"The first ingredient that makes up a time crystal is that it remembers what it was doing initially. It doesn't forget," says von Keyserlingk. "The coins-in-a-box system forgets, but a time crystal system doesn't."

It doesn't stop here. Shake the system an even number of times, and you'll get a similar configuration to the original one; but shake it an odd number of times, and you'll get another set up, in which tails have been flipped to heads and vice-versa.

And no matter how many operations are carried out on the system, it will always flip-flop, going regularly back-and-forth between those two states.

Scientists call this a break in the symmetry of time, which is why time crystals are called so. This is because the operation carried out to stimulate the system is always the same, and yet the response only comes every other shake.

"In the Google experiment, they do a set of operations on this chain of spins, then they do exactly the same thing again, and again. They do the same thing at the hundredth step that they do at the millionth step, if they go that far," says von Keyserlingk.

"So they subject the system to a set of conditions that have symmetry, and yet the system responds in a manner that breaks that symmetry. It's the same every two periods instead of every period. That's what makes it literally a time crystal."


The behavior of time crystals, from a scientific perspective, is fascinating: contrary to every other known system, they don't tend towards disorder and chaos. Unlike the coins in the box, which get all muddled up and settle at roughly half heads and half tails, they buck the entropy law by getting stuck in a special, time-crystal state.

In other words, they defy the second law of thermodynamics, which essentially defines the direction that all natural events take. Ponder that for a moment.

Such special systems are not easy to observe. Time crystals have been a topic of interest since 2012, when Nobel Prize-winning MIT professor Frank Wilczek started thinking about them; and the theory has been refuted, debated and contradicted many times since then.

Several attempts have been made to create and observe time crystals to date, with varying degrees of success. Only last month, a team from Delft University of Technology in the Netherlands published a pre-print showing that they had built a time crystal in a diamond processor, although a smaller system than the one claimed by Google.

The search giant's researchers used a chip with 20 qubits to serve as the time crystal: many more, according to von Keyserlingk, than have been achieved until now, and more than could be simulated with a classical computer.

Using a laptop, it is fairly easy to simulate around 10 qubits, explains von Keyserlingk. Add more than that, and the limits of current hardware are soon reached: each extra qubit doubles the memory required, so the total grows exponentially.
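
A back-of-the-envelope Python sketch makes that scaling concrete: a brute-force state-vector simulation stores 2^n complex amplitudes, so memory doubles with each added qubit. The 16-bytes-per-amplitude figure below assumes double-precision complex numbers; it is an illustrative assumption, not a number quoted in the article.

# Rough memory footprint of brute-force state-vector simulation: 2**n complex
# amplitudes, assumed stored as complex128 (16 bytes each). Every extra qubit
# doubles the requirement, which is why the cost quickly becomes impractical.

BYTES_PER_AMPLITUDE = 16  # complex128: two 8-byte floats

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

if __name__ == "__main__":
    for n in (10, 20, 30, 40):
        gib = statevector_bytes(n) / 2**30
        print(f"{n} qubits: {gib:,.6g} GiB")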

The scientist stops short of stating that this new experiment is a show of quantum supremacy. "They're not quite far enough for me to be able to say it's impossible to do with a classical computer, because there might be a clever way of putting it on a classical computer that I haven't thought of," says von Keyserlingk.

"But I think this is by far the most convincing experimental demonstration of a time crystal to date."


The scope and control of Google's experiment means that it is possible to look at time crystals for longer, do detailed sets of measurements, vary the size of the system, and so on. In other words, it is a useful demonstration that could genuinely advance science and as such, it could be key in showing the central role that quantum simulators will play in enabling discoveries in physics.

There are, of course, some caveats. Like all quantum computers, Google's processor still suffers from decoherence, which can cause a decay in the qubits' quantum states, and means that time crystals' oscillations inevitably die out as the environment interferes with the system.

The pre-print, however, argues that as the processor becomes more effectively isolated, this issue could be mitigated.

One thing is certain: time crystals won't be sitting in our living rooms any time soon, because scientists are yet to find a definitive useful application for them. It is unlikely, therefore, that Google's experiment was about exploring the business value of time crystals; rather, it shows what could potentially be another early application of quantum computing, and yet another demonstration of the company's technological prowess in a hotly contested new area of development.

Continued here:
Google says it has created a time crystal in a quantum computer, and it's weirder than you can imagine - ZDNet

Read More..