A ransom note for the Gigabyte Hack has appeared on the internet! – Security News – BollyInside

You may recall that late last week, it was confirmed that tech manufacturer Gigabyte had been subjected to a successful hacking attempt, with around 112GB of reportedly highly sensitive data stolen from the company, making it the latest big firm to fall victim to a ransomware attack. When the news broke, the hack did seem to be legitimate, but there wasn't a lot of specific information about what the group's demands were or, more specifically, what kind of data they'd successfully stolen. Well, following a report via Videocardz, the supposed ransom note issued to Gigabyte has appeared online and, on the whole, it does appear that this could be quite a serious matter!

Gigabyte Ransomware Attack!

The note itself is rather terse in its terms and makes it abundantly clear to Gigabyte that the group has no interest in having its time wasted by speaking with less important people at the company. While it doesn't cite any specific figures, my reading suggests that the ransom being demanded is quite a significant sum, and if the hacking group feels it is being ignored or stalled, Gigabyte risks that amount going up even further.

What Data Has Been Stolen?

Although still pending confirmation, various sources have cited that among the 112GB of information stolen, a significant portion may have contained some very sensitive data. This could potentially include:

Confidential documents with AMD and Intel
Designs/blueprints for upcoming technology releases
UEFI/BIOS/TPM security keys
Classified product roadmaps

So, on the whole, this could be a very serious problem for Gigabyte, and it raises the question of whether the company will take the risk of allowing the hacking group to release this data publicly or choose to pay the ransom to prevent that from happening. We suspect that any decision may ultimately boil down to an evaluation of what was actually stolen, but as you might expect, making payments to ransomware groups is always going to be self-defeating, as you're only feeding the monster that is trying to kill you!

What do you think? Let us know in the comments!

Cloud computing specialist Beeks debuts new product and bullish over FY results – The Scotsman

The AIM-quoted firm, a cloud computing and connectivity provider for financial markets, said it has experienced good levels of trading in the second half of the year, notwithstanding the ongoing impact of Covid-19.

Furthermore, it expects to announce annual results delivering year-on-year growth in both revenue and underlying core earnings. The firm in March posted a 24 per cent year-on-year jump in interim revenues to £5.29 million.

"The group has continued to successfully expand its relationships with existing tier one customers in the second half and has a growing pipeline of opportunities," it has now added, also flagging an "expanded, differentiated offering, growing sales network, and increased sales pipeline".

It is also confident in its ability to capitalise on the growing demand for cloud computing and connectivity from financial services organisations.

The firm has also launched Proximity Cloud, which it describes as a high-performance, dedicated and client-owned trading environment, fully optimised for low-latency trading conditions and built with security and compliance at the forefront.

The product has been backed by proceeds from the firm's fundraise in April of this year, which generated gross proceeds of about £5.5 million. "Being hosted and managed on a client site, as opposed to within a Beeks facility, this new offering addresses a significant part of the market that was previously unavailable to the company," added the Glasgow-based firm.

It was founded by CEO Gordon McArthur after he became increasingly frustrated by the lack of low-latency trading infrastructure available.

How the Film Industry is Leveraging Cloud Computing? Lights, Camera, Action and Cloud – Analytics Insight

The film industry has undergone many technological changes, such as 3D, 4D, and even 5D, but that is not the end of the story. Cloud computing is now emerging in the field of film technology, which raises the question of how cloud computing is helping the film industry. Cloud computing and film seem like two different things: one deals with storing and transferring data from one place to another, while the other deals with entertaining people. Yet these two worlds are closer than we might imagine.

Cloud computing allows the centralization of data, information, and processes, giving film productions easy access and a clear overview. This results in better decision-making, an improved movie experience, and integrated operations.

Real-Time Discussions

In today's world, a film's production crew, director, participating service partners, and cast are global. Coordinating among these teams is difficult because the physical transfer of data and film reels is a time-consuming process. Having remote access to the files makes the process easier and saves time. This is where cloud computing benefits the film industry: it enables real-time discussions, which save time, effort, and money.

Remote Computing Power

Assembling all the individual elements of a film, including audio mixes, visual effects, and video filters, into the finished product requires heavy computing power. The memory and processing power needed to produce the final frames is much higher than what is available on standard computers. Cloud computing allows projects to be completed much more rapidly by using remote computing power for these finishing processes.

Security

Piracy that occurs pre-release causes a 19% greater decrease in revenue than piracy that occurs post-release. Cloud computing technology offers refined solutions to such security issues for the film industry. By creating private clouds, film studios no longer have to worry about other studios gaining access to their systems.

Streamline Operations

From theatrical planning, booking, and settlements in film distribution to building a smoother, customer-centric approach, cloud computing enables distributors to compare similar films against an individual theatre's performance, giving them the insights to secure a successful release.

Datacenter tech: Here’s where the Open Compute Project is going next – ZDNet

The Open Compute Project (OCP), a computer engineering project launched in 2011 to create better datacenter hardware by sharing designs and ideas between its members, has announced the next stage of its project.

"Over the last 10 years, significant advancements have been made in open compute standards with the formation of multiple working groups delivering over 350 collaborations," said Rebecca Weekly, chair of OCP and senior director of hyperscale Strategy and Execution at Intel.

Weekly's stated goals for OCP 2.0 remain largely the same as the original purpose, including modularity, scalability, sustainability, and the ability to integrate across the stack. OCP also plans to "seed" innovation in optics, open silicon, AI, and cooling.

"While OCP continues to cover all aspects of modular hardware design (compute, storage, switches, accelerators, and racks), there has been growing interest in forward-looking initiatives such as open hardware, chiplets, cooling and software solutions for broad community collaboration, to accelerate innovation and enable scale through ecosystem adoption," says Weekly.

OCP set out its seed plans as:

The open computing community includes the entire supply chain, from tech and data center equipment vendors, to cloud and communications companies, enterprises, system integrators and semiconductor manufacturers.

Since it was formed out of a Facebook engineering project, OCP has gradually gained industry momentum. Besides its founding members (Facebook, Intel, Rackspace and Goldman Sachs), Google Cloud joined OCP in 2016, and Chinese cloud giant Alibaba joined in 2017.

Retail giant Target, a one-time AWS customer, joined OCP this year with contributions around edge-computing hardware. Microsoft uses OCP specs in its Azure cloud, while Google contributed a 48V power distribution rack so it can use OCP technology in its data centers.

20 cloud stocks expected to increase sales the most over the next two years – MarketWatch

U.S. investors remain bullish, despite rumblings out of China and the spike in delta variant infections.

Cloud companies, those at the forefront of the shift in computing power to distributed models over the internet, are expected to grow at a rapid clip over the next several years, and four of the five largest exchange-traded funds covering the space are close to hitting record highs.

Below is a screen of stocks held by those ETFs, showing which are expected to increase their sales the most through 2023. In an industry with many players at relatively early stages, increases in sales, rather than in earnings, might be the best driver of stock prices.

To begin the screen, we looked at the five largest cloud ETFs:

ETFs might be your best way to take a broad approach for a long-term play on the cloud revolution. If you are interested in any ETF, you should review the fund manager's website.

Here's a comparison of total returns through Aug. 4, with the SPDR S&P 500 ETF SPY and the Invesco QQQ Trust QQQ (which tracks the Nasdaq-100 Index NDX) included for reference:

The ETFs' approaches differ. For example, the ARK Next Generation Internet ETF ARKW is the only one that is actively managed. The others track an index. It is also the only one that holds shares of Tesla Inc. TSLA, which makes up 10.65% of the portfolio, according to information posted by ARK Invest on Aug. 5. Tesla is an electric-vehicle manufacturer, but it can also be considered a cloud company because it distributes software updates over the internet continually and offers other cloud-based services.

Another holding unique to ARKW among the five cloud ETFs is Walt Disney Co. DIS, which is certainly an important cloud player through its Disney+ streaming service, even if the company doesn't say directly how much of its sales are derived from that rapidly growing segment.

As part of its description of ARKW, FactSet says the following:

Broadly speaking, ARKW's managers appear focused on big buzzwords such as the Internet of Things, cloud computing, digital currencies and wearable technology. While the fund's focus may be appealing for investors with conviction in these new technologies, portfolio implementation is a more difficult task: most of the companies developing these advancements are huge corporations for which nascent technologies are only a small fraction of total revenues. As such, it's very difficult to get pure-play access to ARKW's targeted technologies, so be sure to confirm that the fund's holdings, not just its thesis, align with your view of the space.

Together, the five cloud ETFs listed above hold 147 stocks. To project sales growth through 2023, we used calendar 2020 sales estimates as a baseline and then looked at consensus estimates among analysts polled by FactSet for the subsequent three years, if available. (The 2020 numbers are estimates, because many companies' fiscal years don't match the calendar.)

To make sure we had a solid set of estimates, we confined the group to the 126 companies covered by at least five analysts polled by FactSet, for which consensus sales estimates for calendar 2020 through calendar 2023 are available.
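The screen's growth metric, a compound annual growth rate from the calendar-2020 baseline to the calendar-2023 estimate, can be sketched as follows; the figures below are purely illustrative, not FactSet data:

```python
def sales_cagr(base_sales: float, final_sales: float, years: int) -> float:
    """Compound annual growth rate between a baseline year and a final year."""
    return (final_sales / base_sales) ** (1 / years) - 1

# Illustrative only: a company with $500M of estimated 2020 sales
# and $1,000M of estimated 2023 sales (three years of growth).
growth = sales_cagr(500.0, 1000.0, 3)
print(f"{growth:.1%}")  # about 26.0%
```

A company doubling its sales over three years thus lands at roughly a 26% CAGR, the kind of figure that puts a stock near the top of a screen like this one.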

Here are the 20 companies projected to have the highest compound annual growth rates (CAGR) for sales through calendar 2023:

Click on the tickers for more about each company.

There are actually 21 stocks listed, including Zillow Group Inc.'s Class A ZG and Class C Z shares.

It is interesting to see that the list is dominated by stocks held by ARKW. The fund has a broad definition of cloud companies and is also focused on sales growth.

Here are current forward price-to-sales ratios based on consensus estimates for the next 12 months, as well as ratios of current market cap to projected 2023 sales and summaries of analysts opinions about the stocks.

In comparison, the forward price-to-sales ratio for SPY is 2.8, with a price/2023 estimated sales ratio of 2.6. For QQQ, the current P/S is 4.7, declining to 4.3 for 2023.
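For readers unfamiliar with the valuation measure quoted above, price-to-sales is simply market capitalization divided by the sales figure for the period; a quick sketch with hypothetical numbers, not any listed company's actual figures:

```python
def price_to_sales(market_cap: float, projected_sales: float) -> float:
    """Market capitalization divided by projected revenue for the period."""
    return market_cap / projected_sales

# Hypothetical company: $14B market cap, $5B in estimated
# next-12-month sales, $7B in estimated 2023 sales.
print(price_to_sales(14e9, 5e9))  # 2.8 forward P/S
print(price_to_sales(14e9, 7e9))  # 2.0 on 2023 estimates
```

The ratio falling from 2.8 to 2.0 as the sales denominator grows is exactly the SPY-style pattern described above: faster expected sales growth means a cheaper multiple on future revenue.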

Software as a Service Reduces Cost of Modernization – GovernmentCIO Media & Research

SaaS solutions have demonstrated widespread potential for reducing both the financial resources and human capital necessary to streamline the IT systems of federal agencies.

The use of Software as a Service (SaaS) has shown considerable potential for reducing the costs of modernizing federal agencies, particularly by alleviating the operating costs and the scope of human capital required to update their IT systems.

Among larger federal agencies, the Department of Veterans Affairs has already implemented SaaS within its efforts to standardize operating procedures across the agency and assist with broader cloud computing transformation. This has allowed VA to roll out software updates to existing employees more quickly while also technically onboarding new employees more easily.

"We switched to Microsoft 365 to get all of our email onto the platforms that most of our employees are using. Then we expanded out to Word and Excel so we could benefit from the fast releases and leverage OneDrive in those products," said Drew Myklegard, executive director of demand management at VA, in a GovFocus interview.

Myklegard also noted this has synced with VA's switch to a public cloud platform, one that has expedited the delivery of software and technical updates across the agency as a whole.

"We also have switched to mostly cloud-provided CRM and ERP just for the ability to add tens of thousands of employees who are able to get those expected software releases," Myklegard said.

Agencies looking to leverage SaaS should analyze its direct benefits for their specific modernization programs and map out how to incorporate these in advance.

"I would encourage agencies to go back and look at what their real drivers are. One of the values that we've seen among our own customers has been around looking at what you already have in your existing capabilities, and looking at those demands to identify what are the best candidates [for SaaS] in particular. We have a lot of customers that are looking to move away from on-prem and get out of all the software maintenance, get out of the patching, and get out of that technological debt. So they're doing the imagery to better understand how they'd get value out of it and how it's aligned to their mission," said Chris Borneman, CTO of SoftwareAG Government Solutions.

Myklegard noted that SaaS has particular value for large agencies with an expansive workforce and scope of mission responsibilities, with SaaS allowing departments to more easily manage large quantities of data and internal communications even amidst software updates and broader IT transformation.

"We have 130,000 major medical facilities and 2,500 minor facilities, as well as all of our cemeteries and regional benefits offices. So when we consolidated our facilities API, we consolidated about 40 or 50 major databases in the backend. Technically that kind of transition isn't hard, but the governance behind it was extremely challenging. You've got to make sure that the service is up and you're able to make quick changes," Myklegard said.

Borneman outlined that this reduces operational costs, allowing agencies to install software updates and implement new capacities in ways that are less taxing to both their budgets and ability to continuously operate.

"You're seeing a reduction in cost because you're not maintaining all the staff to go through and do the upgrades and do the continuous training. You're seeing a great reduction in technology debt. If you find a bug, you're able to fix it very quickly, whereas in a traditional model those are very large costs," Borneman said.

Myklegard explained these benefits have already provided considerable returns for VA, an agency undergoing a substantial scope of IT modernization that technical leadership has needed to reconcile with VA's demanding scope of responsibilities, including the delivery of health care amid a pandemic.

"Anytime a system goes down, it costs us money. When our electronic health record system is down, it's about $1 million a minute. We all intrinsically know that if the system goes down and our people can't work, that's going to be expensive. A couple years ago, almost weekly we would have high priority incidents where our systems were down and you'd push code, and it took you a long time to recover. It was really tough to manage. You're spending half your day trying to solve the problem and the other half trying to communicate to your customers what's gone down. SaaS has just radically reduced that," Myklegard said.

ORock Technologies Partners with Commvault to Set New Standards for Comprehensive Backup and Recovery in the Cloud – PRNewswire

RESTON, Va., Aug. 10, 2021 /PRNewswire/ -- ORock Technologies, Inc., a high-performance hybrid cloud service provider built on OpenStack and certified by FedRAMP and the Department of Defense, today announced it has entered into a partnership with Commvault (NASDAQ: CVLT), a recognized global leader in the management of data across cloud and on-premises environments, to expand data protection and offer a seamless migration to the cloud for mission-critical workloads. This cost-effective solution is ideally suited for any business operating in a highly regulated industry as well as any federal, state, local or education (SLED) government agency seeking comprehensive backup and recovery in a secure cloud environment.

Commvault offers an all-in-one solution combining Commvault Backup & Recovery with Commvault Disaster Recovery for enterprise-grade data protection software that is both powerful and easy to use. ORock offers a secure network architecture and open source, scalable cloud environment that enable organizations to capitalize on the flexibility and scalability of cloud computing, while protecting workloads from security attacks. As highly regulated businesses and federal and SLED agencies come under increasing cyber threats, the need for comprehensive backup and recovery and disaster recovery planning in a secure cloud infrastructure has never been more vital.

The ORock-Commvault partnership allows organizations to use ORock as a validated S3-compatible Object Storage target for their data protection. Service providers and customers utilizing Commvault's Intelligent Data Services, anchored by ORock's government-grade secure cloud, can seamlessly back up and recover data and applications, virtual machines and containers, along with verifiable recoverability of replicas, cost-optimized cloud data mobility, security and resilient ransomware protection, and more. With the combined solution, Commvault users can access, move or repatriate their terabyte stores of data as often as they like without any data egress charges.

"We are happy to announce a high-value, innovative cloud partnership with ORock that benefits our customers in a number of ways," said David Foth, Sales Director, North America Service Providers, Commvault. "Enterprises will only accelerate to the cloud if a robust security apparatus is there to protect all of their data and applications in backup. ORock, known for their rigorous security controls, high-touch customer support and no egress charges, is a leading choice for businesses seeking a trusted hybrid cloud innovator. Our joint solution gives our customers a secure, convenient and cost-effective solution to modernize their data management strategy and team with market leaders."

"We're living in a critical time when companies are planning a hyperconnected multi-cloud future and need the assurance that their cloud ecosystem prioritizes security, accelerates performance and lowers costs," said Gregory Hrncir, Co-Founder, CEO and President, ORock Technologies. "The ORock-Commvault partnership makes it incredibly easy to shift, manage and safeguard workloads in the cloud, shielding organizations from data corruption and ransomware while slashing TCO and storage costs. This is a powerful combination of technologies that will propel your enterprise forward and keep operations running efficiently."

Learn more about how you can leverage the ORock-Commvault solution by visiting ORock.

About ORock Technologies

ORock Technologies delivers hybrid cloud and IaaS solutions designed for secure, compliant data operations. ORock helps leading organizations protect their most sensitive data, control costs and minimize vendor lock-in while enabling a range of IT modernization, application hosting, migration and edge computing initiatives. ORock's private fiber optic backbone network and enterprise-grade open source cloud feature the latest HPE Gen10 secure hardware and a flat-rate OPEX billing model with no data egress fees. These solutions support hybrid, private and multi-cloud capabilities while providing superior security, performance, predictability and control. Learn more about ORock.

CONTACT: Claudia Cahill, ORock Technologies, 571-386-0201, [emailprotected]

SOURCE ORock Technologies, Inc.

Dizzying year sees Rackspace IPO, a dramatic restructuring and a mass layoff. Will the company live up to expectations in '22? – San Antonio…

Rackspace Technology, like the rest of us, has had quite a year.

Apollo Global Management, the behemoth investment firm that took Rackspace private in a $4.3 billion deal in 2016, returned the San Antonio cloud computing company to the stock market 12 months ago.

Since then, Rackspace has cut its workforce and shifted its strategy within the fast-moving cloud computing business. Its stock price has gyrated along the way.

As investors have run hot and cold on the company over the past year, Rackspace has produced strong revenue growth and is expected to generate about $3 billion in revenue this year. That would mark two consecutive years of double-digit growth since 2019, when the company posted revenue of $2.4 billion.

"We really feel good about the revenue growth sustainability," Rackspace CFO Amar Maletira said during the firm's first-quarter earnings call in May. "We can continuously drive double-digit growth into 2021 and beyond."

1998: Rackspace founded

2008: Goes public at $12.50 per share, raising $187.5 million.

2013: Shares climb to about $80.

Feb. 2016: Shares fall as low as $17 amid competition from larger-scale cloud-computing services.

Nov. 2016: Acquired by Apollo Global Management Inc. for $32 a share, taken private in deal valued at $4.3 billion.

April 2020: Apollo registers Rackspace for an IPO that could value it at more than $10 billion.

June 2020: Saying it more accurately reflects new focus, changes name to Rackspace Technology Inc.

July 10, 2020: Registration statement for IPO shows debt of nearly $4 billion, up from $493 million in last filing as public company in 2016.

July 27, 2020: New filing shows it expects to raise nearly $925 million in IPO.

Aug. 5, 2020: Shares debut at $21 and fall nearly 22 percent to close at $16.39 after first day of trading. Selling 33.5 million shares, IPO raises $703.5 million.

Aug. 31, 2020: In first earnings report since IPO, reports loss of $33 million on $657 million revenue in second quarter. Shares close at $19.33.

Nov. 10, 2020: Reports loss of $101 million on $682 million in revenue in third quarter. Shares close at $17.60.

Jan. 26: Taking advantage of low interest rates, refinances nearly $2.9 billion in debt "to repay all borrowings outstanding."

Feb. 19: Reports loss of $64 million on $716 million revenue in fourth quarter. Shares close at $20.93.

March 19: Regulatory filing touches off speculation company could be an acquisition target. Shares close at $24.13.

May 10: Reports loss of $64 million on revenue of $726 million in first quarter. Shares close at $24.02.

May 11: In response to company's forecast for slower growth, shares crater to $19.01.

July 22: Lays off 700 employees, about 10 percent of global workforce, in restructuring.

Thursday: Shares close at $17.13.

Despite its increasing revenue, Rackspace lost $246 million in 2020 and, through the first three months of this year, reported a $64 million loss. The company is projecting a loss of between $30 million and $50 million in the second quarter of this year.

Launched here in 1998, Rackspace grew fast in its early years, energizing struggling and often overlooked entrepreneurs in San Antonio and driving city and business leaders' appetite for a more muscular local technology industry.

As demand for cloud services has ratcheted up, Rackspace has acquired four companies since 2017, including Onica, a cloud services and management firm, and Datapipe, a managed services provider for private and public cloud customers, for a total of $1.7 billion in cash and stock.

Rackspace opened its Open Cloud Academy in 2013 to train tech workers; CodeUp acquired the academy in April for an undisclosed amount. And as they left Rackspace over the years, founders and senior executives began advising local startups and establishing investment funds.

But in its hunt to reach profitability, Rackspace has reshaped its business over the past half-decade. The company used to host websites for companies, and eventually found itself in competition with Amazon, Google and Microsoft.

And if you can't beat 'em, join 'em.

More recently, Rackspace has focused on providing cloud services, helping companies move their data onto the cloud, that is, into remote data storage.

And firms are increasingly going to a multicloud strategy where they use more than one cloud service. Think of working with Amazon Web Services and Google Cloud at the same time, for example.

Pursuing a multicloud strategy allows a company to use the best functions of each cloud service, such as data transfer or automation. And using different cloud services buffers a company against the risk of one service failing.

But using multicloud services can be complex, and firms hire Rackspace to help them make the transition.
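One common way to tame that complexity is a thin abstraction layer over each provider's storage client, so application code is not tied to a single cloud. A hypothetical sketch; the class names and in-memory backend are stand-ins for illustration, not any vendor's real SDK:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Minimal interface a multicloud layer might expose to applications."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; a real deployment would wrap the AWS or
    Google Cloud SDK behind this same interface."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]

# Application code talks only to ObjectStore, so a workload can move
# between clouds (or run on several at once) without rewrites.
primary: ObjectStore = InMemoryStore()
primary.put("report.csv", b"q2,revenue\n")
print(primary.get("report.csv"))
```

The value a consultancy adds in this model is less the interface itself than deciding which workload belongs behind which backend, and handling the migration of existing data.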

The market for multicloud services is expected to reach $520 billion by 2024, a roughly $200 billion increase, according to Gartner, an information technology firm.

While Rackspace shifts its business internally, it has shed employees.

Last month, the company laid off 700 workers, about 10 percent of its global workforce. Rackspace said it is investing in growth areas like cloud migration, Elastic Engineering, professional services, cloud-native application development, data, artificial intelligence, machine learning and security services.

"We are restructuring the company to fuel the investment, and to position our business for these hyper-growth areas," a spokeswoman said.

In 2019, Rackspace cut 200 jobs and moved another 125 to India. In 2017, it cut fewer than 100 positions and issued another 275 pink slips the following year.

Analysts have largely given Rackspace a pass for its heavy losses in recent years because the company has been investing in its own growth. Ten Wall Street analysts covering Rackspace still recommend investors buy the stock.

But it's unclear when investors' appetite for profit will exceed their hunger for potential future growth.

In mid-March, a document Rackspace filed with the Securities and Exchange Commission outlined compensation plans for its top executives in the event of an ownership change. It appeared to suggest the company may be acquired by a rival such as Amazon.

The news sent the stock price to a record high, topping $26 per share in early April.

But when Rackspace reported first-quarter earnings in May, the company's share price plummeted even though results exceeded expectations for the quarter.

The stock price cratered, falling 21 percent in a single day, largely because the company's targets for the second quarter and full year suggested Rackspace's earnings wouldn't grow as quickly as analysts wanted.

Despite calling for second-quarter revenues above analyst targets, the company's forecast of slower earnings growth was enough to shake Wall Street.

The stock price recovered somewhat after Rackspace's first-quarter presentation, but it has struggled to rise since. The company's shares have fallen nearly 14 percent over the last month and traded just above $17 on Thursday.

Rackspace will report second-quarter results on Wednesday.

diego.mendoza-moyers@express-news.net

Hyperion Research’s Steve Conway on the State of AI and the HPC Connection – HPCwire

Looking for an AI refresher to beat the Summer heat? In this Q&A, Hyperion Research Senior Adviser Steve Conway surveys the AI and analytics landscape in a time of intense activity and financial backing. Just last week, the National Science Foundation (NSF) announced it had expanded the National AI Research Institutes program to 40 states (and the District of Columbia) as part of a combined $220 million investment. What is all this attention and investment leading up to? What is significant right now? What's the HPC connection? Keep reading for insights into the questions everyone's asking.

HPCwire: How would you describe the status of AI today?

Conway: AI is at an early developmental stage and is already very useful. The mainstream AI market is heavily exploiting early AI for narrow tasks that mimic a single, isolated human ability, especially visual or auditory understanding, for everything from Siri and Alexa to reading MRIs with superhuman ability.

HPCwire: What's the eventual goal for AI?

Conway: The goal over time is to advance toward artificial general intelligence (AGI), where AI machines are versatile experiential learners and can be trusted to make difficult decisions in real time, including life-and-death decisions in medicine and driving situations. Experts debate what it will take to get there and whether that will happen. Hyperion Research asked noted AI experts around the world about this in a recent study. The sizeable group who believe AGI will happen said, on average, it will take 87 years. There was an outlier at 150 years. But whether or not it happens, AGI is an important aspirational goal to work toward.

HPCwire: What role does HPC play in AI?

Conway: HPC is nearly indispensable at the forefront of AI research and development today, for newer, economically important use cases as well as established scientific and engineering applications. One reason why HPC is attracting more attention lately is that it is showing where the larger, mainstream AI market is likely headed in the future. The biggest gifts HPC is giving to that market are 40-plus years of experience with parallelism and the related abilities to process and move data quickly, on premises and in more highly distributed computing environments such as clouds and other hyperscale environments. The HPC community is also an important incubator for applying heterogeneous architectures to the growing number of heterogeneous workflows in the public and private sectors.

HPCwire: Reversing that question, what role does AI play in HPC?

Conway: A recent Hyperion Research study showed that nearly all HPC sites around the world are now exploiting AI to some extent. Mostly, they're using AI to accelerate established simulation codes, for example by identifying areas of the problem space that can be safely ignored. In cases where the problem space is an extremely sparse matrix, this heuristic approach can be especially helpful. HPC-enabled AI is also used for pre- and post-processing of data.
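Conway's point about skipping ignorable regions can be illustrated with a toy sketch. This is a hypothetical illustration, not code from any study he cites: a simple magnitude threshold stands in for a trained surrogate model, flagging which cells of a sparse problem space actually warrant the expensive simulation kernel.

```python
import numpy as np

def expensive_update(values):
    # Stand-in for a costly per-cell simulation kernel.
    return values * 0.9 + 0.01

def accelerated_sweep(grid, threshold=1e-6):
    # In practice a trained model would predict which regions matter;
    # here a cheap magnitude threshold plays that role.
    mask = np.abs(grid) > threshold
    updated = grid.copy()
    updated[mask] = expensive_update(grid[mask])  # run kernel only where flagged
    return updated, int(mask.sum())

grid = np.zeros((100, 100))
grid[40:45, 40:45] = 1.0  # small active region in a mostly empty domain
result, active = accelerated_sweep(grid)
print(active)  # only 25 of 10,000 cells trigger the expensive kernel
```

The payoff scales with sparsity: the emptier the problem space, the more of the expensive work the cheap predictor lets the solver skip.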

HPCwire: What's the relationship between analytics and simulation in HPC-enabled AI?

Conway: Some applications use analytics alone, but many HPC-enabled AI applications benefit from both data analytics and simulation methodologies. Simulation isn't becoming less important with the rise of AI. This frequent pairing of simulation and analytics says that HPC system designs need to be compute-friendly and data-friendly. Newer designs are starting to reverse the increasing compute-centrism of recent decades and establish a better balance.

HPCwire: You mentioned newer, economically important use cases for HPC-enabled AI. Can you say more about those?

Conway: A few years ago, anecdotal evidence led Hyperion Research to compile a list of repetitive AI use cases that vendors could begin to pursue as emerging HPC market segments: precision medicine, automated driving systems, fraud and anomaly detection, business intelligence, affinity marketing, and IoT/smart cities/edge computing. Hyperion Research's recently completed multi-client study of the worldwide HPC market found that 80 percent of the surveyed HPC sites already use one or more of these applications. Some of this is to support established HPC applications in HPC datacenters, but a surprising portion is to support business operations in enterprise datacenters. This confirmed the growth of a trend we've been tracking for a decade, where enterprise data analytics requirements are pushing up into the HPC competency space.

HPCwire: How is AI related to HPDA?

Conway: Hyperion Research defines high performance data analysis, HPDA, as data-intensive computing that uses HPC resources, whether for simulation or analytics. AI is the HPDA subset that involves data analytics, whether learning models or other analytics methods.

HPCwire: What about AI and cloud computing? Edge computing?

Conway: Our studies show that 20 percent of all HPC workloads are being run in third-party clouds and this number is growing, mostly not at the expense of on-premises computing. AI methods supporting HPC workloads are about as common in cloud settings as on premises. HPC also has a crucial role to play in the important subset of edge computing applications that need wide-area analysis and control, as opposed to just local responsiveness at the edge. A large portion of the one-time Top500-leading Tianhe-1a supercomputer, for example, was dedicated to urban traffic management in Guangzhou. Some respected thinkers believe HPC will be the glue that unifies the emerging global IT infrastructure, from edge to exascale.

HPCwire: What's needed to move things forward? Are people working on these things?

Conway: Things are definitely moving forward, thanks in no small part to researchers advancing AI practices in the worldwide HPC community, but important challenges remain under active work. They include making the operations of multilayered neural networks explainable and trustworthy, ramping up the availability of realistic synthetic data to address the shortage of useful real-world data in some domains, and advancing multimodal AI that can concurrently mimic more than one human sense. A more profound challenge concerns AI methodologies: the decades-old, intensifying debate between experts who believe learning models will be adequate for achieving AGI and those who say learning models mimic only high-level, abstract functions of intelligence and need to be augmented with methods that mirror our brains by directly experiencing the natural world.

HPCwire: The NSF recently announced it is expanding the National AI Research Institutes program to 40 states (and the District of Columbia) as part of a $220 million investment. That is one of many state-sponsored AI projects being launched around the world. Where does public investment fit in your view of AI?

Conway: The U.S., China, Europe and Japan all have government-funded initiatives aimed at increasing their AI capabilities as a prerequisite for scientific-engineering progress and economic competitiveness. They have analogous initiatives in HPC, which has already proven its ability to accelerate scientific and industrial innovation. For AI, HPC and other technologies with strong transformational potential, government investment is crucial for laying out national-regional goals and motivating progress toward those goals, especially when the technology is in an early and uncertain stage, as is certainly true of AI.

HPCwire: Stephen Hawking famously said, "AI is likely to be either the best or worst thing to happen to humanity." Care to comment?

Conway: Who am I to question Dr. Hawking? I think there's a danger that AI's now-unstoppable momentum could overwhelm crucial ethical considerations, but especially in the past two years or so, more attention is being paid to the ethical ramifications of AI progress. With AGI predicted to be a century or so in the future at best, humanity has some time to wrestle with this.

Bio: Steve Conway is Senior Adviser of HPC Market Dynamics at Hyperion Research. Conway directs research related to the worldwide market for high performance computing. He also leads Hyperion Research's practice in high performance data analysis (big data needing HPC).

Read the rest here:
Hyperion Research's Steve Conway on the State of AI and the HPC Connection - HPCwire

Read More..

Dive into cloud computing with 74 hours of AWS exam prep for $14 – BleepingComputer

By BleepingComputer Deals

Most businesses today rely on software and data stored in the cloud. AWS (Amazon Web Services) is by far the most popular provider, accounting for nearly one-third of the market.

If you would like to expand your knowledge of IT or work specifically in this niche, taking the official AWS exams is a must.

The 2021 Ultimate AWS Certified Solutions Architect Associate Exam Prep Bundle helps you pass the tests on the first attempt, with seven courses that prepare you for Amazon's exams. You can get it today for just $14 with code ANNUAL60 in the Semi-Annual Sale at Bleeping Computer Deals.

From project management platforms to accounting software, you won't find many apps today that run locally without connecting to the cloud. AWS powers some of the biggest names around, including Adobe, Autodesk, Slack, Zillow, and even NASA.

With this bundle, you can learn how to keep any AWS setup running smoothly. The beginner-friendly training walks you through the AWS Management Console, key security measures, and the most popular services.

You also learn about access management, how to handle AWS databases, and the process of designing resilient cloud architecture.

Just as importantly, this bundle helps you prove your knowledge to recruiters. You get 74 hours of exam prep in total, working toward the AWS Certified Solutions Architect Associate certification exam and others.

All the content comes from Total Seminars, a training platform used by the United Nations and the U.S. Department of Defense.

It's worth $1,400, but you can order the bundle today for just $14 with the 60% discount code ANNUAL60.

Prices subject to change.

Disclosure: This is a StackCommerce deal in partnership with BleepingComputer.com. In order to participate in this deal or giveaway you are required to register an account in our StackCommerce store. To learn more about how StackCommerce handles your registration information please see the StackCommerce Privacy Policy. Furthermore, BleepingComputer.com earns a commission for every sale made through StackCommerce.

Original post:
Dive into cloud computing with 74 hours of AWS exam prep for $14 - BleepingComputer

Read More..