
FBI sounds the alarm over virulent new ransomware strain – TechRadar

A virulent new ransomware strain has infected at least 60 different organizations in the last two months, the FBI has warned.

In a Flash report published late last week, the agency said that BlackCat, a known ransomware-as-a-service actor, had compromised these organizations using a strain written in Rust.

This is somewhat unusual, given that most ransomware is written in C or C++. The FBI believes these particular threat actors opted for Rust because it is considered a more secure programming language that offers improved performance and reliable concurrent processing.

BlackCat, also known as ALPHV, usually demands payment in Bitcoin or Monero in exchange for the decryption key. Although its demands typically run into the millions of dollars, the group has often accepted payments below the initial figure, the FBI says.

BlackCat also has strong ties to DarkSide (aka BlackMatter), the FBI further explains, suggesting that the group has extensive networks and experience in operating malware and ransomware attacks.

The attack usually starts with an already compromised account, which gives the attackers initial access to the target endpoint. The group then compromises Active Directory user and administrator accounts, and uses Windows Task Scheduler to configure malicious Group Policy Objects (GPOs) that deploy the ransomware.

Initial deployment uses PowerShell scripts, in conjunction with Cobalt Strike, and disables security features within the victim's network.

The attackers are then said to exfiltrate as much data as possible before locking up the systems, even pulling data from any cloud hosting providers they can find.

Finally, with the help of Windows scripting, the group seeks to deploy ransomware onto additional hosts.

The FBI has also created a comprehensive list of recommended mitigations, which include reviewing domain controllers, servers, workstations, and Active Directory for new or unrecognized user accounts; regularly backing up data; reviewing Task Scheduler for unrecognized scheduled tasks; and requiring admin credentials for any software installation.
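
To make one of those mitigations concrete, here is a hedged Python sketch of how an administrator might review Task Scheduler for unrecognized scheduled tasks. The allowlist file and its location are assumptions for the example, not part of the FBI guidance.

```python
# Illustrative sketch only: flag scheduled tasks that are not on a local allowlist.
# Assumes a Windows host with the built-in "schtasks" utility on PATH, and an
# "allowlist.txt" file (one task name per line) maintained by the administrator.
import csv
import io
import subprocess

def load_allowlist(path="allowlist.txt"):
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

def list_scheduled_tasks():
    # "schtasks /query /fo CSV /v" prints one verbose CSV row per scheduled task.
    out = subprocess.run(
        ["schtasks", "/query", "/fo", "CSV", "/v"],
        capture_output=True, text=True, check=True,
    ).stdout
    rows = csv.DictReader(io.StringIO(out))
    # The verbose output repeats its header row per task folder; skip those.
    return [r for r in rows if r.get("TaskName") and r.get("TaskName") != "TaskName"]

def find_unrecognized_tasks(tasks, allowlist):
    return [t for t in tasks if t["TaskName"] not in allowlist]

if __name__ == "__main__":
    for task in find_unrecognized_tasks(list_scheduled_tasks(), load_allowlist()):
        print(f"Unrecognized task: {task['TaskName']} (run as {task.get('Run As User')})")
```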

A new vision of artificial intelligence for the people – MIT Technology Review

But few people had enough mastery of the language to manually transcribe the audio. Inspired by voice assistants like Siri, Mahelona began looking into natural-language processing. "Teaching the computer to speak Māori became absolutely necessary," Jones says.

But Te Hiku faced a chicken-and-egg problem. To build a te reo speech recognition model, it needed an abundance of transcribed audio. To transcribe the audio, it needed the advanced speakers whose small numbers it was trying to compensate for in the first place. There were, however, plenty of beginning and intermediate speakers who could read te reo words aloud better than they could recognize them in a recording.

So Jones and Mahelona, along with Te Hiku COO Suzanne Duncan, devised a clever solution: rather than transcribe existing audio, they would ask people to record themselves reading a series of sentences designed to capture the full range of sounds in the language. To an algorithm, the resulting data set would serve the same function. From those thousands of pairs of spoken and written sentences, it would learn to recognize te reo syllables in audio.

The team announced a competition. Jones, Mahelona, and Duncan contacted every Māori community group they could find, including traditional kapa haka dance troupes and waka ama canoe-racing teams, and revealed that whichever one submitted the most recordings would win a $5,000 grand prize.

The entire community mobilized. Competition got heated. One Māori community member, Te Mihinga Komene, an educator and advocate of using digital technologies to revitalize te reo, recorded 4,000 phrases alone.

Money wasn't the only motivator. People bought into Te Hiku's vision and trusted it to safeguard their data. "Te Hiku Media said, 'What you give us, we're here as kaitiaki [guardians]. We look after it, but you still own your audio,'" says Te Mihinga. "That's important. Those values define who we are as Māori."

Within 10 days, Te Hiku amassed 310 hours of speech-text pairs from some 200,000 recordings made by roughly 2,500 people, a level of engagement that is almost unheard of in the AI community. "No one could've done it except for a Māori organization," says Caleb Moses, a Māori data scientist who joined the project after learning about it on social media.

The amount of data was still small compared with the thousands of hours typically used to train English language models, but it was enough to get started. Using the data to bootstrap an existing open-source model from the Mozilla Foundation, Te Hiku created its very first te reo speech recognition model with 86% accuracy.
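
For readers curious what bootstrapping an open-source model can involve in practice, here is a hedged Python sketch that assembles crowdsourced recordings and the prompts people read into the CSV manifest format (wav_filename, wav_filesize, transcript) used by Mozilla's open-source DeepSpeech training scripts. The file names and folder layout are assumptions for illustration, not Te Hiku's actual pipeline.

```python
# Hedged sketch: build a DeepSpeech-style training manifest from a folder of
# recordings plus a prompts file. "recordings/" and "prompts.tsv" are invented
# names for the example.
import csv
import os

def build_manifest(recordings_dir="recordings", prompts_path="prompts.tsv",
                   out_path="train.csv"):
    # prompts.tsv is assumed to map a recording id to the sentence that was read,
    # e.g. "clip_0001<TAB>Ko te reo te mauri o te mana Maori"
    prompts = {}
    with open(prompts_path, encoding="utf-8") as f:
        for line in f:
            clip_id, sentence = line.rstrip("\n").split("\t", 1)
            prompts[clip_id] = sentence

    with open(out_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["wav_filename", "wav_filesize", "transcript"])
        for name in sorted(os.listdir(recordings_dir)):
            if not name.endswith(".wav"):
                continue
            clip_id = os.path.splitext(name)[0]
            if clip_id not in prompts:
                continue  # skip recordings with no matching prompt
            path = os.path.join(recordings_dir, name)
            writer.writerow([path, os.path.getsize(path), prompts[clip_id]])

if __name__ == "__main__":
    build_manifest()
```

The resulting train.csv (together with similar dev and test splits) is the kind of input a DeepSpeech training run consumes.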

New Navy Artificial Intelligence-Enhanced Drones Are Ready to Set Sail – The National Interest Online

The U.S. Navy's artificial intelligence-enabled, autonomous drones are already functional, and many new types of systems are set to advance beyond the conceptual and prototype stages. The Navy intends for these systems to not only network with one another but also function autonomously. To expedite this process, the Navy is standing up and improving its Rapid Autonomy Integration Lab.

Algorithms enabling greater levels of autonomy are progressing quickly, and the Navy is already leveraging them to engineer and test a fleet of coordinated, integrated unmanned systems that can network with one another, synchronize, and execute time-sensitive missions without needing human involvement. As part of the Navy's Ghost Fleet Overlord program, these drones will not only utilize their autonomous capability on an individual scale, but will also participate in collective, autonomous missions that are enabled by common software interfaces and AI-enabled data processing. Navy weapons developers increasingly plan to improve levels of autonomy as technology progresses.

"For subsurface platforms, we have small, medium, and large. We currently have four prototypes today. They're demonstrating increasing autonomous capabilities and discovering new opportunities, new exercises," Capt. Scot Searles, Unmanned Maritime Systems program manager, told an audience at the 2022 Sea Air Space Symposium.

The first two Ghost Fleet Overlord program vessels, Ranger and Nomad, were initially run by the Strategic Capabilities Office, a specialized Pentagon unit designed to find, develop and integrate innovations for operational use in the force. Due to their successful and rapid development, these two autonomous surface vessels have been transitioned to the Navy.

Alongside these, more prototypes are in development and on the way, Searles explained.

"Now we're in the second phase of prototyping. We have two more vessels, Navy-funded this time, under construction, the first of which is delivered. Its GFE (Government Furnished Equipment) is being installed right now, and the other one is under construction. We also have two smaller prototype vessels as well."

Autonomous unmanned systems are already reshaping Navy concepts of operation and will continue to do so at a blistering pace. Of course, while Pentagon doctrine ensures that no lethal force is authorized without a human in the loop, unmanned systems will continue to perform a much wider range of operations than has previously been possible. For instance, a Ghost Fleet or group of integrated unmanned systems could survey an enemy coastline, test enemy defenses, assess a threat environment, and exchange relevant data regarding an optimal point of attack. Targeting specifics could be shared across a group of unmanned systems in real time, with the hope of quickly pairing new targeting information with shooters or modes of attack to eliminate enemies quickly. Another key advantage is survivability: unmanned systems allow manned ships and sailors to operate at safer stand-off distances. In the future, for example, sea basing is expected to take on a larger role, and big-deck amphibious assault ships may increasingly function as mother ships, performing command and control and operating an entire small fleet of drones at one time.

Kris Osborn is the Defense Editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army for Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a master's degree in Comparative Literature from Columbia University.

Image: Flickr.

Is artificial intelligence the future of warfare? – Al Jazeera English

From: UpFront

We discuss the risks behind autonomous weapons and their role in our everyday lives.

"If we're looking for that one terminator to show up at our door, we're maybe looking in the wrong place," says Matt Mahmoudi, an Amnesty International artificial intelligence researcher. "What we're actually needing to keep an eye out for are these more mundane ways in which these technologies are starting to play a role in our everyday lives."

Laura Nolan, a software engineer and a former Google employee now with the International Committee for Robot Arms Control, agrees. "These kinds of weapons, they're very intimately bound up in surveillance technologies," she says of lethal autonomous weapons systems, or LAWS.

Beyond surveillance, Nolan warns: "Taking the logic of what we're doing in warfare or in our society, and we start encoding it in algorithms and processes, can lead to things spinning out of control."

But Mahmoudi says there is hope for banning autonomous weapons, citing existing protections against the use of chemical and biological weapons. "It's never too late, but we have to put human beings and not data points ahead of the agenda."

On UpFront, Marc Lamont Hill discusses the risks behind autonomous weapons with the International Committee for Robot Arms Control's Laura Nolan and Amnesty International's Matt Mahmoudi.

How to amend the Artificial Intelligence Act to avoid the misuse of high-risk AI systems – The Parliament Magazine

As the opinion rapporteur for the Artificial Intelligence Act in the Committee on Culture and Education (CULT), I will present a proposal for amending the Artificial Intelligence Act in March. The draft focuses on several key areas of artificial intelligence (AI), such as high-risk AI in education, high-risk AI requirements and obligations, AI and fundamental rights as well as prohibited practices and transparency obligations.

The regulation aims to create a legal framework that prevents discrimination and prohibits practices that violate fundamental rights or endanger our safety or health. One of the most problematic areas is the use of remote biometric identification systems in public spaces.

Unfortunately, the use of such systems has increased rapidly, especially by governments and companies seeking to monitor gathering places, for example. It is incredibly easy for law enforcement authorities to abuse these systems for mass surveillance of citizens. Therefore, the use of remote biometric identification and emotion recognition systems is over the line and must be banned completely.

Moreover, the misuse of technology is concerning. I am worried that countries without a functioning rule of law will use it to persecute journalists and prevent their investigations. It is obviously happening to a certain extent in Poland and Hungary, where governments have used the Pegasus software to track journalists and members of the opposition. How hard will it be for these governments to abuse remote biometric identification, such as facial recognition systems?

As far as we know, the Hungarian government has already persecuted journalists in the so-called interest of national security for questioning the government's actions amid the pandemic. Even the Chinese social credit system, which ranks the country's citizens, is based on the alleged purpose of ensuring security.

It is absolutely necessary to set rules that will prevent governments from abusing AI systems to violate fundamental rights. In October, a majority of the European Parliament voted in favour of a report on the use of AI in criminal law. The vote showed a clear direction for the European Parliament in this matter.

The proposal includes a definition of so-called high-risk AI systems. HR tools that filter applications, banking systems that evaluate our creditworthiness and predictive control systems all fall under the definition of high-risk because they can easily reproduce bias and deepen disparities.

With AI present in education as well, the proposal covers test evaluation and entrance examination systems. This list should be expanded to include online proctoring systems. However, different interpretations of the GDPR in the case of online proctoring systems have resulted in differences in personal data protection in Amsterdam, Copenhagen and Milan.

According to the Dutch and Danish decisions, there was no conflict between online proctoring systems and the GDPR, but the Italian data protection authority fined and banned further use of these technologies. Currently, universities are investing in new technologies without knowing whether they are authorised to use them or if they are going to be fined.

In my opinion, technologies used for students' personalised education should be included in the high-risk category as well. In this case, incorrect usage can negatively affect a student's future.

In addition to education, the CULT committee focuses on the media sector, where AI systems can be easily misused to spread disinformation. As a result, the functioning of democracy and society may be in danger.

When incorrectly deployed, AI systems that recommend content and learn from our responses can systematically display content that forms so-called rabbit holes of disinformation. This increases hatred and the polarisation of society and has a negative impact on democratic functioning.

We need to set clear rules that will not be easy to circumvent. Currently, I am working on a draft legislative opinion which will be presented in the CULT committee in March. I will do my best to fill all the gaps that I have identified.

The Council is also working on its position. A compromise presented by the Slovenian presidency was found, for example, in extending the social scoring provisions from public authorities to private companies as well.

Application of Artificial Intelligence in telecommunications – Telecom Lead

Artificial intelligence, machine learning, and business intelligence are being widely used to boost the success and capabilities of various organizations. The telecom industry, too, uses AI to tackle network issues, poor data analysis, high costs, and a crowded marketplace. As a telecommunication company running certain operations remotely, you will need specific data and software to help you manage your work. One software option suited to remote businesses is coAmplifi, which lets you boost productivity and monitor your employees right from the comfort of your home.

What Is Artificial Intelligence?

Artificial intelligence is a branch of computer science. AI was built to allow machines to use human knowledge to perform various tasks efficiently. Systems embedded with artificial intelligence can replicate human behavior if they have been fed the correct information. Artificial intelligence systems can learn, plan, reason, solve problems and make decisions.

Common AI Applications in the Telecommunication Industry

Fraud Elimination and Revenue Growth

AI can be used to eliminate or reduce the chances of fraudulent activity. Since machine learning processes and AI algorithms work in real time, they can easily detect fraudulent transactions, unauthorized access, and fake profiles. As soon as fraud is detected, the system automatically blocks access to prevent the loss of important company information and other assets.
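
As a rough illustration of this kind of real-time anomaly flagging, the hedged Python sketch below scores transactions with scikit-learn's IsolationForest; the feature columns and thresholds are invented for the example and do not describe any particular operator's fraud system.

```python
# Minimal sketch of anomaly-based fraud flagging with scikit-learn's IsolationForest.
# The feature columns ("amount", "calls_per_hour", "new_device") and the contamination
# rate are illustrative assumptions, not a real operator's model.
import numpy as np
from sklearn.ensemble import IsolationForest

# Historical, presumed-mostly-legitimate activity: [amount, calls_per_hour, new_device]
history = np.array([
    [12.0, 3, 0],
    [15.5, 4, 0],
    [9.9,  2, 0],
    [14.0, 5, 1],
    [11.2, 3, 0],
])

model = IsolationForest(contamination=0.05, random_state=0).fit(history)

# Score incoming transactions as they arrive; -1 means "looks anomalous".
incoming = np.array([
    [13.0, 4, 0],     # ordinary
    [950.0, 40, 1],   # unusually large amount, burst of calls, new device
])
for row, label in zip(incoming, model.predict(incoming)):
    status = "FLAG for review / block" if label == -1 else "ok"
    print(row, status)
```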

AI can also be used to increase revenue by attracting more subscribers. Analyzing data and identifying patterns in the data generated by networks, cellphones, geolocation, user profiles, billing, telecom devices, and service usage can help predict solutions that boost success. The company can then use these insights to sell its services smartly in real time. An AI system will make the right offer to the right customer at the right time, which maximizes revenue.

Network Optimization

Since the launch of 5G networks in 2019, around 1.7 billion people have subscribed to the service. AI is used to create self-optimizing networks that can handle the growth and demand caused by increased subscribers.

The AI system allows automatic network optimization by focusing on traffic, time zone, and regions. Advanced algorithms are used to predict trends, generate patterns, and eradicate problems within the network.

Predictive Analytics

Telecom AI systems use predictive analytics generated from data, algorithms, and machine learning to predict future trends. Trends are deduced through comparison with historical data. Current states, failures, and other patterns become quickly visible, which can prevent the company from faltering.

AI software allows CSPs (communications service providers) to locate problems in control towers, data cables, service centers, hardware, and even telecom devices installed in users' homes.
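
A minimal, hedged sketch of the forecasting idea in Python: fit a regression on lagged daily traffic figures and predict the next day. The sample numbers and window size are assumptions for illustration only.

```python
# Hedged sketch of trend prediction from historical network data: fit a linear
# regression on lagged daily traffic and forecast the next day. The sample data
# and window size are illustrative assumptions, not real operator telemetry.
import numpy as np
from sklearn.linear_model import LinearRegression

daily_traffic = np.array([100, 104, 103, 110, 115, 118, 117, 125, 130, 133], dtype=float)

window = 3  # predict tomorrow from the last 3 days
X = np.array([daily_traffic[i:i + window] for i in range(len(daily_traffic) - window)])
y = daily_traffic[window:]

model = LinearRegression().fit(X, y)
next_day = model.predict(daily_traffic[-window:].reshape(1, -1))[0]
print(f"Forecast traffic for the next day: {next_day:.1f}")
```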

Virtual Assistants

AI can help provide virtual assistants for customer support. These assistants can easily take on one-on-one conversations and personalize the whole experience. The system is embedded with answers to general queries such as installation help, setup, maintenance, troubleshooting, and other network issues.

Since there are numerous queries daily, the AI system reduces the burden on customer support agents, or even removes the need for them altogether.

Robotic Process Automation

Robotic Process Automation, or RPA, is a business automation system that utilizes AI to handle backend operations, repetitive actions, and processes based on fixed rules.

It eliminates human error by automating transactions and other procedures. The workforce can then focus on other essential tasks such as billing, data entry, management, and order completion.

Endnote

Combining AI and telecommunications can help businesses grow, modernize and be more successful. Since everyone is trying to modernize, there is no reason to stay behind; falling behind will only make you less competitive as a business.

MEDIA ALERT: Business Insurance to Host Webinar "How Artificial Intelligence is Transforming the Insurance Industry" – Yahoo Finance

Gradient AI Sponsors Webinar to Explore the Promise and Challenges of AI in the Insurance Industry

April 21, 2022--(BUSINESS WIRE)--Gradient AI

WHAT: Business Insurance is hosting a webinar, "How Artificial Intelligence is Transforming the Insurance Industry." Sponsored by Gradient AI, a leading provider of proven artificial intelligence (AI) solutions for the insurance industry, the webinar will cover real-world use cases and explore AI's powerful benefits, enabling attendees to gain an actionable understanding of AI's potential and its value to their business.

WHEN: April 26, 2022, 1:00 PM - 2:00 PM EDT / 10:00 AM - 11:00 AM PDT

WHO: Featured Speakers include:

Builders Insurance's Mark Gromek, Chief Marketing and Underwriting Officer

Florida State University's Dr. Patricia Born, Midyette Eminent Scholar in Risk

CCMSI's S. F. "Skip" Brechtel, Jr., FCAS, MAAA, Executive VP and CIO

WHY ATTEND: As digital transformation has disrupted many industries, AI is poised to do the same for insurance enterprises. Attendees will learn:

How to use AI to gain a competitive advantage and generate improved business outcomes, such as better key operational metrics

How AI can increase the efficiency and accuracy of underwriting and claims operations

The challenges and opportunities facing the next generation of insurance professionals

WHERE: Learn more and register here.

Tweet this: How Artificial Intelligence is Transforming the Insurance Industry Webinar: April 26, 2022, 1:00 pm EDT https://register.gotowebinar.com/register/5324033635572250381?source=GradientAI #AI #insurance #insurtech

About Gradient AI:

Gradient AI is a leading provider of proven artificial intelligence (AI) solutions for the insurance industry. Its solutions improve loss ratios and profitability by predicting underwriting and claim risks with greater accuracy, as well as reducing quote turnaround times and claim expenses through intelligent automation. Unlike other solutions that use a limited claims and underwriting dataset, Gradient's software-as-a-service (SaaS) platform leverages a vast dataset comprising tens of millions of policies and claims. It also incorporates numerous other features including economic, health, geographic, and demographic information. Customers include some of the most recognized insurance carriers, MGAs, TPAs, risk pools, PEOs, and large self-insureds across all major lines of insurance. By using Gradient AI's solutions, insurers of all types achieve a better return on risk. To learn more about Gradient, please visit https://www.gradientai.com.

View source version on businesswire.com: https://www.businesswire.com/news/home/20220421005382/en/

Contacts

Elyse Familant, elysef@resultspr.net, 978-376-5446

Tech update: Using artificial intelligence to solve supply chain snarls, and consolidation in Canadian crypto – Toronto Star

As Russia's invasion of Ukraine fans the flames of already rising global inflation, the Bank of Canada is stepping in to try to put the brakes on surging prices here at home. The central bank hit Canadians this month with its first oversized interest rate hike in decades: half a percentage point.

The war is also driving up the prices of energy and other commodities, further disrupting global supply chains, with freighters full of commercial goods stuck at overwhelmed ports.

Canada's top artificial intelligence companies believe they can develop strategies and programs to get products to market faster, and that makes the timing ripe for a new Canadian A.I. startup program.

Federal artificial intelligence agency Scale AI has announced phase two of its supply chain venture accelerator. It will support the growth and commercialization of a dozen promising Canadian A.I. companies through the Supply AI program, delivered by the MaRS Discovery District.

The 12 startups will work with experts to scale their companies, grow market share and increase their exposure to potential new investors.

Supply chain obstacles: "Many of the products and services we use these days are inherently complex. Some require thousands of parts or co-ordination of suppliers across multiple geographies," says Osh Momoh, chief technical adviser at MaRS. "Think of vaccines, automobiles or the consumer products Amazon delivers to our doors."

How A.I. can help: Many startup founders, like the ones in the Supply AI program, believe artificial intelligence can be used to forecast client demand for supplies and improve routing to move items faster. It can automate the physical movement of goods and the assembly of products in business environments such as warehouses.

Toronto-based Taiga Robotics is part of the program. It aims to reinvent factories with its fleet of A.I.-powered robots, which it rents out to small and medium-sized businesses to perform tasks such as sorting and packaging.

"This alleviates strained labour resources by making robots more accessible in general," says CEO and co-founder Dmitri Ignakov, "making them viable for workflows which are smaller than what would have justified an investment in traditional automation."

Canada as an A.I. leader: "We have the talent. We need an ecosystem for it to thrive here." To thrive, Ignakov says, A.I. ventures need help removing barriers to adoption in the marketplace.

"These companies often need support, from being able to run pilots on public streets and roads to getting some assistance reducing the cost of initial deployments with their first customers."

The shark's presence grows in Canada's crypto tank

Kevin O'Leary-backed crypto company WonderFi Technologies Inc. is about to buy its second crypto trading platform this year, Coinberry Ltd., for $38.5 million in shares.

The Vancouver-based WonderFi also recently acquired Toronto-based crypto exchange Bitbuy Technologies Inc., which puts two of Canada's six registered cryptocurrency trading platforms under one company's control.

WonderFi CEO Ben Samaroo thinks having Shark Tank host O'Leary as an investor gives them a competitive advantage.

"Kevin is a major advocate for compliant investments, as compliance is required for institutional investors to get comfortable."

Canadian market consolidates: When the deal with Coinberry closes, WonderFi will own a third of the Canadian licences for crypto platforms. The company will house more than 750,000 users across its ecosystem and employ more than 180 people, making it the country's largest crypto company.

Martin Piszel, the CEO of crypto trading platform Coinsquare, says he expects to see further consolidation as companies try to gain overall market share and take advantage of potential synergies. "The cost of regulation and the increasing cost of acquiring clients will drive platforms to look for ways to gain scale and improve efficiencies," he says.

Crypto grows up: The next step for the industry could be increased regulation. Some Canadian crypto companies operate under temporary two-year regulation licences. The next stage of their process is to register as a full Investment Industry Regulatory Organization of Canada (IIROC) dealer to create long-term stability, push out unlicensed operators and attract foreign investment.

"Regulated platforms will become more valuable to the Canadian crypto market," Piszel says. Large foreign players will assess the cost and time of entering the Canadian marketplace and may opt to look for acquisition targets that have already done the heavy regulatory work, he says.

A window of opportunity in biotech

As part of the fight against COVID-19, Ontario recently announced increased access to a new therapeutic drug for patients infected with the virus. Newly approved Paxlovid reduces the risk of hospitalization and death in COVID patients by 89 per cent, according to a Pfizer study.

The pandemic has increased our awareness of the biotech industry, specifically the need to increase Canada's capacity to produce vaccines and therapeutics. But researchers say we shouldn't stop there.

The Innovation Economy Council (IEC) has just released a report that says Canada has a huge opportunity to take the lead in gene and cell therapy. The country has the capacity to create a multibillion-dollar industry, but it needs to expand its ability to manufacture treatments, conduct clinical trials and train talent.

What's next? The Toronto-based Centre for Commercialization of Regenerative Medicine (CCRM) has helped fund and lead a dozen promising startup cell and gene therapy companies, employing 250 people and raising an impressive $770 million in venture capital. This year, CCRM will break ground on a biomanufacturing plant at McMaster Innovation Park in Hamilton. The facility will help Canadian companies get their groundbreaking treatments to patients.

Why is cell and gene therapy so important? CCRM president and CEO Michael May says cell and gene therapies promise cures for diseases, not just treatment of symptoms.

"This is revolutionizing medicine and we can see it playing out around the world today," May says. "Canadian science has helped define this industry and I believe that industrializing the sector is Canada's opportunity for global leadership in life sciences."

Canada's next best move: May wants the federal government to invest in infrastructure, emerging companies, talent and new therapies.

"I'd like to see products stamped 'Made in Canada.' Canada should be the go-to destination for capability and expertise and we must be a trailblazer in clinical adoption of these revolutionary therapies," May says. "I can picture a vibrant ecosystem with Canada's ecosystem as the nexus of a global industry."

Janey Llewellin writes about technology for MaRS. Torstar, the parent company of the Toronto Star, has partnered with MaRS to highlight innovation in Canadian companies.

Disclaimer: This content was produced as part of a partnership and therefore it may not meet the standards of impartial or independent journalism.

Carbon and the cloud: Why data may be part of the climate change problem – VentureBeat

Efforts to achieve carbon neutrality are on the rise, with a global consensus that climate change is an urgent concern that requires international action.

While carbon-intensive industries like manufacturing, transportation and agriculture are used to being on the receiving end of public criticism for their ecological shortcomings, data has emerged as another source of concern.

In the spirit of Earth Day, it may be time to reconsider more sustainable solutions when it comes to data storage.

We're generating more data than ever before, which, of course, means that data centers are also growing to accommodate increasingly complex IT infrastructures.

But these data storage centers, both on premises and in the cloud, consume massive amounts of energy and electricity. New devices connect to the internet every second, and according to a recent McKinsey study, data centers account for 1.8% of electricity use in the United States and contribute 0.3% of all global CO2 emissions.

If data keeps flowing like water trying to get out, it will eventually be nearly impossible to stay ahead of the dam breaking. Luckily, there are steps enterprises can take to plug the holes early and become truly green in the near future.

In the Exponential Climate Action Roadmap laid out by the World Economic Forum, digital technologies could help reduce global carbon emissions by up to 15%, one-third of the 50% reduction required by 2030.

On the surface, this sounds like a lofty goal. But reducing your environmental footprint first requires understanding how your data storage is currently managed, both from an IT perspective and across other departments within your organization. This involves more than purchasing energy-efficient hardware.

Recently, Mendix, a low-code software development platform, and Atos, an IT service and consulting company, expanded their global partnership to drive digital decarbonization with their enterprise low-code solutions.

Atos is already in the process of building industry-specific solutions based on Mendix's platform to monitor, report and track real-time energy consumption and carbon emissions across 1,800 locations globally.

At a time when the tech industry is booming, the global software developer shortage is predicted to reach 85.2 million workers by 2030. This developer drought necessitates a paradigm shift in the way companies develop and maintain software applications.

Low-code tools are designed to manage updates through automation, helping to bridge the skills gap by enabling non-programmers to build products without having to learn complex computer languages. This not only frees up IT teams for other vital tasks, but drastically reduces their products' time to market.

Johan den Haan, CTO of Mendix, notes: "The goal of low-code is to deliver maximum value using minimal resources. If you want to decarbonize, you need to digitize. And if you want to digitize, you need to democratize your software with low-code solutions."

In addition to low-code, the term cloud-native computing has become somewhat of a catchall for the various systems and strategies required by software developers to create, deploy and manage modern software applications on cloud infrastructure.

Research from Veritas Technologies shows that 99% of organizations plan to increase workloads in cloud environments over the next three years, with 98% attributing the shift to sustainability strategies. However, storing data in the cloud takes energy to power servers, storage equipment and cooling infrastructure, and that energy largely comes from burning fossil fuels.

Eric Seidman, senior director at Veritas Technologies, says that evolving backup strategies to include advanced deduplication can help organizations meet their environmental impact goals and shrink their carbon footprints.

At a high level, deduplication refers to a process that eliminates redundant data. Most enterprises create multiple copies of the same datasets mainly to ensure that a backup exists at all times in the event of a hardware failure or security breach.

Though replication provides many benefits, U.S. Grid Emissions Factor data suggests that storing just 1 petabyte of unoptimized backup data in the cloud for one year could create as much as 3.5 metric tons of CO2 waste.
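
To make the deduplication idea concrete, here is a hedged Python sketch of simple block-level deduplication: files are split into fixed-size chunks, each chunk is hashed, and a chunk's bytes are stored only the first time that hash appears. It is a teaching sketch under those assumptions, not how Veritas or any vendor implements the feature.

```python
# Minimal illustration of block-level deduplication: split files into fixed-size
# chunks, hash each chunk, and store a chunk's bytes only the first time its hash
# is seen. Real systems often use variable-size (content-defined) chunking.
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB chunks, an illustrative choice

def dedupe_store(paths):
    store = {}        # sha256 digest -> chunk bytes (stands in for the backup repository)
    manifests = {}    # file path -> ordered list of digests, enough to rebuild the file
    raw_bytes = 0
    for path in paths:
        digests = []
        with open(path, "rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                raw_bytes += len(chunk)
                digest = hashlib.sha256(chunk).hexdigest()
                store.setdefault(digest, chunk)   # only unique chunks consume space
                digests.append(digest)
        manifests[path] = digests
    stored_bytes = sum(len(c) for c in store.values())
    print(f"raw: {raw_bytes} bytes, stored after dedupe: {stored_bytes} bytes")
    return store, manifests

# Example call (hypothetical file names):
# dedupe_store(["backup_monday.img", "backup_tuesday.img"])
```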

"In order to make data storage more efficient and less harmful to the climate, the lifecycle of the data needs to be intelligently managed," said Seidman.

To achieve this, organizations need to determine what data can be eliminated, compressed and/or optimized for storage in the cloud.

Doing so will result in conserved storage capacity, a reduction in idle time and significant cloud storage savings.

The more digitized our economy becomes, the more we'll continue to rely on data storage centers to support it.

But, the environmental implications behind data creation and storage can no longer be ignored. Key technical decision-makers must find environmentally sustainable ways to bring value to a company without moving at a glacial pace.

Pure Storage wants to work with data gravity, not against it – Blocks and Files

Pure Storage CEO Charles Giancarlo expressed two noteworthy views in an interview with Blocks & Files: that hyperconverged infrastructure doesn't exist inside hyperscaler datacenters, and that data needs virtualizing.

He expressed many noteworthy views, actually, but these two were particularly impressive. Firstly, we asked him if running applications in the public cloud rendered the distinction between DAS (Direct-Attached Storage) and external storage redundant. He said: "In general the public cloud is designed with disaggregated storage in mind, with DAS used for server boot drives."

"The storage systems are connected to compute by high-speed Ethernet networks."

"It's more efficient than creating virtual SANs or filers by aggregating each server's DAS in the HCI (hyperconverged infrastructure). HCI was a good approach generally, in the 2000 era when networking speeds were in the 1Gbit/s area, but now, with 100Gbit/s and 400Gbit/s coming, disassociated elements can be used and this is more efficient."

HCI's use is limited, in Giancarlo's view, by scaling difficulties: the larger an HCI cluster becomes, the more of its resources are applied to internal matters and not to running applications.

Faster networking is a factor in a second point he made, about data virtualization: "Networking was virtualized 20 years ago. Compute was virtualized 15 years ago, but storage is still very physical. Initially networking wasn't fast enough to share storage. That's not so now." He noted that applications are becoming containerized (cloud-native) and so able to run anywhere.

He mentioned that large datasets at petabyte scale have data gravity; moving them takes time. With Kubernetes and containers in mind, Pure will soon have Fusion for traditional workloads and Portworx Data Services (PDS) for cloud-native workloads. Both will become generally available in June.

What does this mean? Fusion is Pure's way of federating all Pure devices, both on-premises hardware/software arrays and off-premises software in the public cloud, with a cloud-like hyperscaler consumption model. PDS, meanwhile, brings the ability to deploy databases on demand in a Kubernetes cluster. Fusion is a self-service, autonomous, SaaS management plane, and PDS is also a SaaS offering, for data services.

We should conceive of a customers Pure infrastructure, on and off-premises, being combined to form resource pools and presented for use in a public cloud-like way, with service classes, workload placement, and balancing.

Giancarlo said datasets will be managed through policies in an orchestrated way, with one benefit being the elimination of uncontrolled copying.

He said: "DBMSes and unstructured data can be replicated 10 or even 20 times for development, testing, analytics, archiving and other reasons. How do people keep track? Dataset management will be automated inside Pure."

Suppose there is a 1PB dataset in a London datacenter and an app in New York needs it to run analysis routines. Do you move the data to New York?

Giancarlo said: "Don't move the [petabyte-level] dataset. Move the megabytes of application code instead."

A containerized application can run anywhere. Kubernetes (Portworx) can be used to instantiate it in the London datacenter. In effect, you accept the limits imposed by data gravity and work with them, by moving lightweight containers to heavyweight data sets and not the inverse. You snapshot the dataset in London and the moved containerized app code works against the snapshot and not the original raw data.

When the app's work is complete, the snapshot is deleted and excess data copying is avoided.
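
A hedged sketch of what "move the code to the data" can look like in practice, using the official Kubernetes Python client: select the kube context of the cluster co-located with the dataset (a hypothetical "london" context here) and submit the containerized analytics job there against a snapshot-backed volume. The context, namespace, image and claim names are assumptions for illustration, not Pure's Fusion or Portworx API.

```python
# Hedged illustration of "move the app to the data": submit a containerized analytics
# job to the cluster co-located with the dataset, instead of copying the data out.
# The kube context ("london"), namespace, image, and claim name ("dataset-snapshot-pvc",
# assumed to be a PVC cloned from a snapshot) are hypothetical examples.
from kubernetes import client, config

def run_analytics_near_data(context="london", namespace="analytics",
                            image="registry.example.com/analytics-job:latest",
                            claim_name="dataset-snapshot-pvc"):
    config.load_kube_config(context=context)  # target the cluster where the data lives

    volume = client.V1Volume(
        name="dataset",
        persistent_volume_claim=client.V1PersistentVolumeClaimVolumeSource(
            claim_name=claim_name),
    )
    container = client.V1Container(
        name="analytics",
        image=image,
        volume_mounts=[client.V1VolumeMount(name="dataset", mount_path="/data",
                                            read_only=True)],
    )
    pod_spec = client.V1PodSpec(restart_policy="Never",
                                containers=[container], volumes=[volume])
    job = client.V1Job(
        api_version="batch/v1",
        kind="Job",
        metadata=client.V1ObjectMeta(generate_name="near-data-analytics-"),
        spec=client.V1JobSpec(template=client.V1PodTemplateSpec(spec=pod_spec)),
    )
    created = client.BatchV1Api().create_namespaced_job(namespace=namespace, body=job)
    print(f"Submitted job {created.metadata.name} in cluster context '{context}'")

if __name__ == "__main__":
    run_analytics_near_data()
```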

Of course, data does have to be copied for disaster recovery reasons. Replication can be used for this, as it is not as time-critical as an analytics app needing results in seconds rather than waiting hours as a dataset slowly trundles its way through a 3,500-mile network pipe.

Giancarlo claimed: "With Pure Fusion you can set that up by policy and keep track of data sovereignty requirements."

He said that information lifecycle management ideas need updating with dataset lifecycle management. In his view, Pure needs to be applicable to very large-scale dataset environments, the ones being addressed by Infinidat and VAST Data. Giancarlo referred to them as up-and-comers, saying they were suppliers Pure watched, although he said it didn't meet them very often in customer bids.

Referring to this high-end market, Giancarlo said: "We clearly want to touch the very large scale environment that our systems haven't reached yet. We do intend to change that with specific strategies." He gave no more detail about that. We asked about mainframe connectivity and he said it was relatively low on Pure's priority list: "Maybe through M&A, but we don't want to fragment the product line."

Pure's main competition comes from incumbent mainstream suppliers such as Dell EMC, Hitachi Vantara, HPE, IBM, and NetApp. "Our main competitive advantage," he said, "is we believe data storage is high-technology and our competitors believe it's a commodity. This changes the way you invest in the market."

For example, it's better to have a consistent product set than multiple, different products to fulfill every need. Take that, Dell EMC. It's also necessary and worthwhile to invest in building one's own flash drives and not using commodity SSDs.

Our takeaway is that Pure is bringing the cloud-like storage consumption and infrastructure model to the on-premises world, using the containerization movement to its advantage. It will provide data infrastructure management facilities to virtualize datasets and overcome data gravity by moving compute (apps) to data instead of the reverse. Expect announcements about progress along this route at the Pure Accelerate event in June.
