Category Archives: Cloud Servers

Customer trends require technology to leverage – Smart Business Network

Even in a world dominated by digital technologies, some companies still hesitate to adopt tools that could take their business to the next level. That hesitation often keeps businesses from engaging with developing customer trends, most of which increasingly rely on digital technologies.

Smart Business spoke with Jim Altman, Middle Market Pennsylvania Regional Executive at Huntington Bank, about some burgeoning digital trends, as well as strategies to take advantage of them.

What digital transformation trends are taking hold?

Companies today are using personalization to differentiate customer experiences. The experiences customers want differ based on who they are and what they need. With greater personalization and customization, companies can segment their approach so that narrower customer segments get catered products, experiences and services.

Unlocking the customer data companies hold goes a long way toward understanding the unique value a business can provide its customers. Every company collects a great deal of customer data, but many don't use that data to drive experiences. It's important that companies learn to analyze the data they capture, in part so they can drive more personalized experiences.

Another trend has companies creating ecosystems that serve as a platform for broader interaction, either between a company and its customers, or between customers and other companies. Companies are starting to figure out who they need to partner with (often because those partners work outside the company's first line of expertise) to provide value to their customers, in part by offering multiple services in one place.

Why might businesses hesitate to adopt newer technologies?

Often business leaders, when presented with a new technology, aren't eager to employ it. They may understand that the tool offers a more efficient way to perform a task, but they resist, instead relying on the way they've always done it because, in their mind, it's easier.

Additionally, as more applications are based in the cloud, some companies avoid using them because they are uncomfortable relying on cloud services. That's typically out of concern for security, as some companies believe the cloud is not as secure as software hosted on on-premises servers. While this might seem like a prudent risk mitigation strategy, it often means the company misses out on opportunities to save costs through increased efficiencies, or to better connect with customers.

Overcoming that requires educating leadership about the pros and cons of cloud-based applications, making communication key. It takes working with someone who is able to explain the applications, how they work, and how to establish procedures that mitigate as much risk as possible.

Once the benefits and risks are clearly communicated, businesses should start small. For instance, begin with one cloud application and work up to more as the organization gets more comfortable. Companies tend to think they need to move all applications to the cloud at once. But starting small can help the initiative gain credibility at all levels of the business, which instills confidence and helps the organization move to the next stage. The process can begin by creating a transition plan that identifies specific applications to move to the cloud one at a time. Once most people are comfortable, it can really take off from there.

How can companies mitigate technology risks?

A lot of risk can be mitigated by working with external partners. For example, there are a lot of fintechs that have created niche businesses that can help companies in very specific areas. For companies facing a risk that's outside their comfort zone, partnering with one of these up-and-coming companies that have mastered a specific functionality where that risk is found can help mitigate security concerns while connecting with valuable expertise.

Look for ways to partner with entities that have expertise and have already assumed certain risks. There are so many businesses that specialize in certain areas that companies often don't need to develop products, software or procedures internally because specialized partners can do it quicker and with less risk.

Insights Banking & Finance is brought to you by Huntington Bank


Google Cloud and SAP Take Companies to The Cloud in New Partnership – Somag News

Google Cloud: The German company SAP SE, creator of business management software, announced on Thursday (the 29th) the strengthening of a strategic partnership with Google Cloud. Under the agreement, the American cloud platform becomes part of the German company's RISE with SAP program, which promises its customers a holistic transformation into an Intelligent Company.

Extending the terms of an agreement signed in 2017, the two companies become cloud partners to accelerate the process of migrating customers and business processes to the new technology. The environment brings together computing services, including servers, storage, databases, and the use of Artificial Intelligence (AI) and Machine Learning (ML).

The investment in cloud services can represent a reduction in operational costs, while shedding physical infrastructure brings gains of scale. The objective of the newly signed partnership is for companies to be able to execute transformations in their business, migrate critical business systems to the cloud, and gain access to the sophisticated tools of SAP and Google Cloud.

Thomas Saueressig, the executive responsible for product engineering at SAP, explains that RISE with SAP is already a globally recognized program adopted by customers who want to accelerate the journey to becoming smart companies. The expansion of the partnership with Google Cloud adds the American company's powerful infrastructure, along with AI and ML services, to the portfolio for those seeking cloud services.

In practice, this means that RISE with SAP program solutions such as SAP Analytics Cloud and SAP Data Warehouse Cloud, currently housed on the German company's business technology platform (SAP BTP), will migrate to Google Cloud's reliable and scalable infrastructure, which also offers a high-speed network.

Some pilot customers, such as Energizer Holdings Inc. and MSC Industrial Supply, already access the RISE with SAP option within Google Cloud.


Here’s What We Know About Google’s Tensor Mobile Chip So Far – iPhone in Canada

Tensor is the first system-on-chip (SoC) designed by Google, which will make its first appearance in the company's next-generation Pixel 6 and Pixel 6 Pro flagship smartphones set to release later this year.

Google CEO Sundar Pichai noted in a statement to Engadget that Tensor has been four years in the making and builds off two decades of Google's computing experience. The company also says its new computational processing for video feature, debuting in the upcoming Pixel 6 phones, was only possible with its own mobile processor.

Tensor is an obvious nod to the company's open-source platform for machine learning, TensorFlow. The chip is powerful enough to run multiple AI-intensive tasks simultaneously without the phone overheating, or to apply computational processing to videos as they're being captured.

The company isn't giving away all the details about the processor yet, nor is it sharing specific information about its latest flagships now. But "there's a lot of new stuff here, and we wanted to make sure people had context," Rick Osterloh said. "We think it's a really big change, so that's why we want to start early."

You can learn more about the development of Google's new Tensor mobile processor at the source page.


Top 10 Data Center Stories of the Month: July 2021 – Data Center Knowledge

Jim Whitehurst Left IBM Because He'd Rather Be CEO - Contrary to what some may have feared, the exec's departure wasn't a sign that IBM was trying to make Red Hat more like IBM, reneging on its earlier promise.

Xilinx Wants to Flip the CPU-Accelerator Relationship Upside Down - The new Versal HBM accelerator is a kind of self-adapting micro-supercomputer for servers and server clusters.

Why CISA's China Cyberattack Playbook Is Worthy of Your Attention - The advisory outlines the tactics, techniques, and procedures China's state-sponsored cybercriminals use to breach networks.

Equinix Sees 5G as a Chance to Seed a New Interconnection Ecosystem - Blurring of the boundaries between 5G and cloud infrastructure may be an opportunity for the colocation giant.

Vertiv CEO Rob Johnson On the Pandemic, Supply Chain Woes, and Data Center Tech - The Data Center Podcast: Managing through a global health crisis and parts shortages amid a demand crunch.

Security Problems Worsen as Enterprises Build Hybrid and Multicloud Systems - Under pressure to accelerate cloud adoption, IT orgs often skip over crucial security planning steps.

Microsoft Pledges to Emit Zero Carbon By 2030 - Taking another climate moonshot, Microsoft has set out to crack a problem that will require a complete rethink of the world's electric grids.

Vertiv CEO Says Data Center Supply Chain Crunch Is Driving Up Costs - Rising component and materials costs and skyrocketing data center demand make a perfect storm.

The Kaseya Ransomware Attack Is a Wakeup Call for MSP-Reliant IT Shops - The pandemic has driven more outsourcing to MSPs, making them prime targets for cybercriminals wanting to scale their attacks.

Azrieli Makes Bigger Data Center Bet with Green Mountain Acquisition - The Israeli developer has agreed to acquire the Norwegian data center operator for $850 million.


RMIT to Implement Dedicated Cloud Supercomputing Facility on AWS to Boost Research Capabilities – HPCwire

July 30, 2021 -- The Royal Melbourne Institute of Technology (RMIT) cloud supercomputing facility is designed to help more researchers and students within RMIT's industry hubs (including Industry 4.0, advanced manufacturing, space, fintech, digital health, and creative technologies) innovate beyond the limitations of on-premises HPC infrastructure and accelerate time-to-science.

The cloud supercomputing facility will use AWS to provide elastic, secure, and scalable cloud infrastructure for researchers and students within RMIT's industry hubs, including Industry 4.0, advanced manufacturing, space, fintech, digital health, and creative technologies, to run high performance computing (HPC) applications with seamless access.

Workloads such as genomic sequencing, autonomous vehicle simulations, and atmospheric modelling are often too large to run using traditional servers. HPC on AWS provides virtually unlimited compute capacity that meets the infrastructure requirements of almost any application, allowing researchers to process huge volumes of data in far less time to help solve some of the world's most complex challenges, from disease prevention to extreme weather forecasting and citizen safety.

RMIT will leverage AWS Direct Connect, which enables customers to have low-latency, secure, and private connections to AWS for workloads that require higher speed or lower latency than the internet. The increased bandwidth will give researchers, students, staff, and industry partners the ability to experiment and test new ideas and discoveries involving large data sets at speed, fast-tracking the time between concept and products that RMIT is ready to take to market.

RMIT will also collaborate with telecommunications provider AARNet, which will provide high-speed internet and communication services, and global technology company Intel, for its advanced technology solutions to process, optimise, store, and move large, complicated data sets.

RMIT Deputy Vice-Chancellor (STEM College) and Vice President Digital Innovation, Professor Aleksandar Subic, said the facility, supported by the Victorian Government Higher Education Investment Fund, is a pioneering example of innovation in the university sector.

"Our collaboration with AWS, Intel, and AARNet to establish Australia's first cloud supercomputing facility represents a step change in how universities and industries access HPC capabilities for advanced data processing and computing," Subic said.

"By leveraging AWS Direct Connect, RMIT is set to access tremendous HPC processing power using a unique service model that provides seamless access to all our staff, researchers, and students."

"Our industry partners will also have access to the new cloud supercomputing facility through joint projects and programs."

"The facility will be operated by our researchers and students, in another example that shows how industry engagement and work-integrated learning are in our DNA."

AWS Director and Country Leader for Worldwide Public Sector in Australia and New Zealand, Iain Rouse, said AWS helps researchers quickly analyse massive amounts of data and share their results with collaborators around the world.

"With access to the broadest and deepest portfolio of cloud services, RMIT can innovate beyond the limitations of on-premises computing, and keep up with scientific advances worldwide."

"We are proud to support ground-breaking research initiatives in collaboration with RMIT, which is set to enable researchers, students, and industry across a broad range of sectors to design solutions and bring them to market sooner, all of which wouldn't be possible at this speed and scale without the elasticity of the cloud."

AARNet CEO Chris Hancock said AARNet had provided RMIT and other Australian universities with leading-edge telecommunications services to enable transformational research outcomes for decades.

"We've also been connecting researchers to the cloud for many years, but nothing on this scale," he said.

"We're excited to be partnering with RMIT on this project that uses our ultra-fast network to remove the barrier of geography and distance for research across Australia and beyond."

RMIT's new School of Computing Technologies, a centre for digital innovation, world-class research, and education in science, technology, engineering, and mathematics, launched earlier this year. The centre will support the development and operation of the cloud supercomputer, building on its sector-leading capabilities in cloud technologies.

Source: Amelia Harris, RMIT


Morphisec is Named A Finalist for the Top 10 Black Unicorns – PR Web


BEER SHEVA, Israel and BOSTON (PRWEB) August 02, 2021

Morphisec, a leader in cloud-delivered endpoint and server security solutions, has been named a finalist for the Top 10 Black Unicorns in the Black Unicorn Awards for 2021. The Cyber Defense Black Unicorn Awards showcase cybersecurity companies judged to have the potential to be valued at $1B within the next 36 months.

Morphisec was chosen as a finalist among the top ten Black Unicorns from an extensive number of submissions from the most promising cybersecurity companies. All entrants are targeting a public offering or being acquired for more than $1B. The recognition comes on the heels of Morphisec raising $31M earlier this year to enable every business to simply and automatically prevent the most dangerous cyberattacks.

"Midsized enterprises are historically underserved by the cybersecurity market, and their challenges have increased exponentially in the last year as the pandemic and work-from-home has made perimeter-based security irrelevant," said Morphisec CEO Ronen Yehoshua. "Morphisec has proven to be the only cybersecurity solution capable of providing affordable, simple, and effective protection for their distributed workforces against a seemingly endless amount of ransomware, malware and evasive attacks."

Today, Morphisec protects more than 8 million endpoints and workloads in a low-cost, automated, and deterministic fashion. Morphisec comes to these organizations' defense without needing dedicated security teams to respond to and investigate attacks, automatically stopping the most dangerous attacks targeting workstations, VDIs, servers, virtual machines, and cloud workloads.

"It's exciting to see Morphisec making it into our Top Ten Black Unicorns among other cybersecurity industry leaders," said judges Robert R. Ackerman Jr. of http://www.allegiscyber.com, David DeWalt of http://www.nightdragon.com and Gary Miliefsky of http://www.cyberdefensemediagroup.com.

About Morphisec

Morphisec is the world leader in providing advanced security solutions for midsize to small enterprises around the globe. The company's security products simplify and automatically block modern attacks from the endpoint to the cloud. Unlike traditional security solutions relying on human intervention, Morphisec delivers operationally simple, proactive prevention. This approach protects businesses around the globe with limited security resources and training from the most dangerous and sophisticated cyber attacks.

About the Cyber Defense Black Unicorn Awards

This is Cyber Defense Magazine's ninth year of honoring cybersecurity innovators through the Black Unicorn Awards, hosted on its Cyber Defense Awards platform. In this competition, judges for these prestigious awards include cybersecurity industry veterans, trailblazers and market makers Gary Miliefsky of CDMG, Robert R. Ackerman Jr. of Allegis Cyber and David DeWalt of NightDragon, with much appreciation to emeritus judge Robert Herjavec of Herjavec Group. To see the complete list of finalists for the Black Unicorn Awards for 2021, please visit https://cyberdefenseawards.com/black-unicorn-awards-for-2021-the-winners/



Azure Sentinel in the Real World – Virtualization Review


Smaller organizations need the same IT security services as larger businesses but without the corresponding price tag, says Paul Schnackenburg, so he decided to "build a SIEM for SMBs" on a shoestring budget.

Back in mid-2019 we looked at Azure Sentinel (then recently released), Microsoft's cloud-based Security Information and Event Management (SIEM). In this article I'll guide you through a real-world Sentinel deployment for one of my clients, lessons learned and some thoughts around SMB cybersecurity in general.

Stuck in the Middle

My business has been providing IT services to SMBs since 1998 so I know the challenges and limitations of the "smaller end of town" intimately. The move to cloud is completed for most of my clients, with some still in a hybrid world with a few workloads on-premises.

Just like larger businesses, SMBs feel the pressure of shrinking IT budgets, the challenge of the Covid pandemic and, most of all, the changing cybersecurity landscape. But there's no way that they are going to be able to afford a full-blown Managed Detection and Response (MDR) solution, backed by a 24/7/365 Security Operations Center (SOC).

So, I do what I can -- I deploy centrally managed antimalware on each endpoint, I ensure they have a business class firewall for their offices, I provide security awareness training and simulated phishing campaigns and I configure their cloud services according to best practices. I also make sure they have solid backups, with copies stored off site. But my concern is the same as many larger organizations, the lack of visibility -- if (when) they're compromised we won't know about it until it's too late. It's the same dilemma as always: SMBs need the same IT services as larger businesses but without the corresponding price tag.

When I saw that Sentinel provided several free data sources (Azure activity, Office 365 audit logs and alerts from the Microsoft 365 Defender suite) as long as I don't retain it for longer than 90 days and that Sentinel has connectors for nearly every data source, I decided to see if I could "build a SIEM for SMBs" on a shoestring budget.

The client I started with is an independent school with approximately 90 students, from year 1 to year 12, plus about 20 staff. They have Microsoft 365 A3 (equivalent to E3 in the commercial world) deployed to all staff and students and two on-premises Dell Hyper-V hosts running Windows Server 2019 with a total of seven VMs. The newer server runs all VMs, and the older server is in a separate building as a Hyper-V replica target for DR. The VMs are two DCs, a file/print server, a LOB app, Windows Server Update Services (WSUS), Microsoft's Advanced Threat Analytics (ATA) and a Linux syslog server (more on that last one later).

Connecting Data Sources

I set up an Azure account for the client, based on the same Azure Active Directory as their Microsoft 365 tenant, and deployed a Log Analytics workspace with Azure Sentinel on top of it in the Australia East region (I always use https://www.azurespeed.com/ to make sure I host resources in the closest region whenever possible). I set the retention to 90 days (as that's free), but I know that many security professionals will probably choke on their morning coffee reading that because it severely limits the ability to find intruders with long dwell times -- many organizations (and regulatory frameworks) require several years of retention. But the aim here is to fit within a small budget and provide visibility to catch the bad guys early, so 90 days it is.
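If you prefer to script that first step rather than click through the portal, a minimal sketch using the azure-mgmt-loganalytics SDK might look like the following (the subscription, resource group and workspace names are hypothetical, and Sentinel itself is then enabled on top of the finished workspace):

```python
# Minimal sketch: create the Log Analytics workspace Sentinel sits on,
# with retention pinned to the free 90-day window. All names are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

credential = DefaultAzureCredential()
client = LogAnalyticsManagementClient(credential, "<subscription-id>")

poller = client.workspaces.begin_create_or_update(
    "rg-sentinel",                     # hypothetical resource group
    "law-client-sentinel",             # hypothetical workspace name
    {
        "location": "australiaeast",   # closest region, per azurespeed.com
        "sku": {"name": "PerGB2018"},  # pay-as-you-go tier
        "retention_in_days": 90,       # the free retention ceiling
    },
)
workspace = poller.result()
print(workspace.customer_id)  # the workspace ID the agents will need later
```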

Next, I configure data connectors (there are 116 to pick from at the time of writing, with more added each week): Azure AD, DNS, Office 365, Security Events, Threat intelligence - TAXII, and Windows Firewall.

Two of those are simple cloud connectors: just provide Global Administrator (or Security Administrator) credentials and pick what to ingest. The Azure AD connector, for example, only asks which logs (sign-in and audit) to pull in.

Most connectors come with workbooks for visualization; the Office 365 connector, for instance, includes a workbook summarizing user activity.

These give you a way to visualize and dig into normal activity by your users, in this case what they do across OneDrive, Exchange, SharePoint and Teams.

The DNS, Security events and Windows Firewall connectors rely on log data from the on-premises VMs and hosts. On each of them I installed the Microsoft Monitoring Agent (MMA) and configured them with the workspace ID and primary key from the Log Analytics workspace. This is a simple install. If you have servers that don't have internet connectivity, you can use the Log Analytics gateway to proxy the uploads, but that's not an issue at this client. If I was going to do this again, I would instead opt for the newer Azure Monitoring Agent (AMA) as it's the future log collecting agent across both Windows and Linux. One benefit of AMA is data collection rules, which let you filter to collect only specific log entries using XPath queries; fortunately, the Security events connector with MMA lets you filter on Minimal or Common (or all) events. I picked Common.
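To show what that XPath filtering looks like, here is a rough sketch of the windowsEventLogs fragment of an AMA data collection rule, written as a Python dict; the stream name follows the documented DCR schema as I understand it, and the event IDs are purely illustrative:

```python
# Sketch of the data-sources fragment of a data collection rule (DCR) for AMA.
# The XPath queries ship only selected Security log events instead of the
# whole log; 4624/4625 are logons, 4688 is process creation (illustrative picks).
dcr_data_sources = {
    "windowsEventLogs": [
        {
            "name": "securityEventsFiltered",
            "streams": ["Microsoft-SecurityEvent"],
            "xPathQueries": [
                "Security!*[System[(EventID=4624 or EventID=4625 or EventID=4688)]]",
            ],
        }
    ]
}
```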

Again, in an ideal world I would deploy the agent on all client endpoints as well for full visibility of all security events across all nodes (which I'll do at my next client who only has 10 client devices and a NAS file server), but at this client I'll need to watch the ingestion cost carefully before expanding log collection.

The last data connector is Threat intelligence -- TAXII, which is one way to ingest TI data into Sentinel. Based on this blog post I connected to Anomali's free intel feed to get data on new ransomware domains/IPs, malware domains, TOR nodes, C2 servers and compromised hosts.

Rules, Log Queries and Workbooks

Once the data is in Sentinel it's time to mine it for suspicious activity. Sentinel comes with hundreds of built-in analytics rule templates -- I looked through the list (filtering on High and Medium severity) to start with and enabled all of them that relied on the data connectors we have.

Here's an example of one such rule -- Rare RDP Connections -- which identifies when a new or unusual connection is made to any of our servers (RDP is only available on the internal network).
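I won't reproduce the template's exact query, but the detection logic is roughly of this shape -- a hedged KQL sketch (kept in a Python constant for consistency with the other snippets), not the built-in rule itself:

```python
# Hedged approximation of a "rare RDP connections" detection: baseline the
# computer/source-IP pairs seen for interactive RDP logons (EventID 4624,
# LogonType 10) over two weeks, then flag new pairs from the last day.
RARE_RDP_KQL = r"""
let baseline = SecurityEvent
    | where TimeGenerated between (ago(14d) .. ago(1d))
    | where EventID == 4624 and LogonType == 10
    | distinct Computer, IpAddress;
SecurityEvent
| where TimeGenerated > ago(1d)
| where EventID == 4624 and LogonType == 10
| join kind=leftanti baseline on Computer, IpAddress
| project TimeGenerated, Computer, Account, IpAddress
"""
```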

I've set up most of these rules to run once a day to generate alerts.

When you're trying to understand the data you have you can use Logs to explore the different data sources and tables. Here I'm looking at the data coming back from Windows Firewall on the servers.
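That exploration can also be scripted. Here's a small sketch using the azure-monitor-query SDK; the workspace ID is a placeholder, and the WindowsFirewall column names are my assumption based on the Windows Firewall solution's schema:

```python
# Sketch: query the WindowsFirewall table from Python instead of the Logs blade.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

QUERY = r"""
WindowsFirewall
| summarize Events = count() by Computer, FirewallAction
| order by Events desc
"""

response = client.query_workspace(
    workspace_id="<workspace-id>",  # the workspace's customer_id
    query=QUERY,
    timespan=timedelta(days=7),     # look at the last week of firewall data
)
for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```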

Most connectors come with a set of workbooks, which is another way of visualizing the data. Here's the Insecure Protocols workbook, aggregating legacy protocols in use across AD and AAD in the last seven days.

Automation

Alerts in the portal are great -- once you see them you can start an investigation to determine if this is really a malicious issue that needs further investigation or a benign false positive. But as mentioned, there's just me and I certainly have better things to do than sitting and staring at a portal UI all day. Each alert rule has the ability to configure an automated response when it's triggered, but I didn't want to have to set up and maintain this for each rule, so I created a single Logic App that catches any alert and emails it to me (and the local IT teacher at the school).
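The Logic App itself is assembled in the designer rather than written as code, but its job is small enough to illustrate. Here's a hedged plain-Python sketch of the same catch-all pattern (the payload field, addresses and SMTP host are all hypothetical; this is not the Logic App's actual definition):

```python
# Illustration of what the catch-all playbook does: take whatever alert
# payload arrives and forward a readable summary by email.
import json
import smtplib
from email.message import EmailMessage

def forward_alert(alert_json: str) -> None:
    alert = json.loads(alert_json)
    msg = EmailMessage()
    # 'AlertDisplayName' is a hypothetical field name for this sketch
    msg["Subject"] = f"Sentinel alert: {alert.get('AlertDisplayName', 'unknown rule')}"
    msg["From"] = "sentinel@example.edu"                     # hypothetical sender
    msg["To"] = "admin@example.edu, it-teacher@example.edu"  # hypothetical recipients
    msg.set_content(json.dumps(alert, indent=2))             # raw details for triage
    with smtplib.SMTP("smtp.example.edu") as smtp:           # hypothetical SMTP host
        smtp.send_message(msg)
```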


Sending data to the cloud, NC makes long-awaited election system updates – WRAL.com

By Jordan Wilkie, Carolina Public Press

By the end of the summer, all 100 county boards of elections in North Carolina will be rid of the computer servers that hold voter registration data. The information will be stored in the cloud instead.

This is an early step in what will be a years-long and nearly $3 million process to upgrade state and county election systems to improve security, usability and efficiency, according to the N.C. State Board of Elections.

The state will upgrade its voter registration and back-end data management, which are essential for running elections but little seen or understood by voters. The changes will not affect voting machines or the election equipment that makes, scans and counts ballots.

Originally designed in 1998 and put in place statewide in 2006, North Carolina's current election information management system is made up of a network of data servers in the state office and every county, woven together by a network of computer programs.

That was almost another geological era of cybersecurity risk management, according to John Sebes, co-founder and chief technology officer at the nonprofit Open Source Election Technology Institute. Back then, election administrators were not worrying about computer hacks from foreign nations or even criminals looking to make a buck.

"We have to recognize it's not just the technology front that's evolved so much; it's the threat," Sebes said.

The scope of the projects shows how election administration has evolved since the turn of the century. Running elections now requires handling ever more data managed through increasingly complex voting technologies, all while protecting against the kinds of cybersecurity threats that challenge major corporations and the federal government.

Updates planned over the next three years will make cybersecurity practices more consistent across all 100 county boards of elections, streamline updates to the back-end systems, write new software for use at the county and state levels, and replace the state servers with new hardware, according to Brian Neesby, chief information officer for the State Board of Elections.

Moving voter registration data from county servers to the cloud lays the foundation for all the other changes.

"This is a big step toward the implementation of modernization as opposed to talking about modernization," said Derek Bowens, Durham County's election director.

The State Board of Elections often consults with county election directors on elections improvements, and Bowens said he hopes that the board will consult with directors like him in the process of designing the new election management system.

Several other agencies have an interest in how the State Board of Elections runs. The board coordinates its security stance with other state agencies, like the Department of Information Technology and the Department of Public Safety, and with federal agencies, including the National Guard, FBI and Department of Homeland Security.

When complete, the state's election infrastructure will be more resistant to computer attacks, and managing election data should be easier, Neesby said. Election security experts agree, with one important caveat: if it is done correctly.

North Carolina is adding Microsoft into the mix to take advantage of the kind of computer servers and security that only a multinational tech company can provide.

The state's plan to move the counties' voter registration systems to the cloud by the end of the summer means putting the data on Microsoft's servers to be accessed remotely in each county.

"Overall, the migration will improve our security posture because we will limit the surface area of attack; the cloud will allow us to exert easier control over our security practices," Neesby wrote in an email to CPP.

Relying on companies like Microsoft can be a two-edged sword, according to Duncan Buell, chair emeritus in computer science and engineering at the University of South Carolina. If the contract is written well and the software that will connect the counties to the cloud is secure, the move will likely be an improvement.

But since Microsoft serves some of the most important government and commercial clients, it is a huge target, Buell said.

"They will be attacked by everybody, and they have been attacked by everybody," Buell said.

In the past, governments have been hesitant to store sensitive state data on a private company's computers, Sebes said, because it raises questions about data ownership and custodianship.

But in an age when the technological capacity of companies like Microsoft far outpaces what local governments are able to offer, and with the development of specific products for government use, states are getting over old fears, Sebes said.

"All you're really losing control of is where the hardware lives and who does the physical security, and who does the personnel security for the physical data center staff," Sebes said. "That's a reasonable amount to give up."

In March 2020, malware froze Durham County's website and many of its computer systems. The hack did not seem to target the county's voting systems, which were not directly affected. However, since the attack happened so close to the primary election and threatened to delay post-election audits, it raised alarms.

The county Board of Elections installed a localized version of its election management system and was quickly able to overcome other inconveniences like disabled phone lines and limited access to emails. In the end, the audits were only slightly delayed.

This kind of computer attack has become more common across the country and is just one of many ways hackers can inject chaos into an election.

Though the ultimate impact on Durham's primary election was minimal, the incident showed the importance of running election systems independently of other government systems, creating redundancy in the system and having backup plans in place should something fail.

Doomsday scenarios include scrambling the voter registration system so it is impossible to know who can vote, cutting power to the grid in major cities or a successful disinformation campaign convincing enough people to not trust the election results. Another worst-case scenario is the much-discussed but low likelihood of a hack into voting machines.

Though Russian state hackers probed voter registration databases in all 50 states in 2016, successfully infiltrated Illinois' system and compromised one voting systems contractor, no votes were changed, and the outcome of the election was legitimate.

The sudden awareness of foreign nations' attempts to interfere in U.S. democracy sparked a dramatic response from Congress and federal agencies. The federal government found state election systems were vulnerable and needed significant security upgrades.

In January 2017, the Department of Homeland Security designated election infrastructure as part of the nation's critical infrastructure, meaning it is among the most important systems keeping the country functioning.

The designation and its accompanying changes fueled a consensus among election experts and the federal and state governments that the 2020 elections were the most secure elections ever held in the United States.

There is also consensus that states still have room to improve.

Counties and the Board of Elections have wanted to upgrade the voter registration system for the better part of a decade, Neesby said, but they did not have the funding or the staff.

Election officials weren't originally thinking about security. They just wanted to make a cumbersome system more streamlined, according to Neesby.

But with the cybersecurity threat to U.S. democracy laid bare in 2016, the federal government increased funding and resource sharing with states to shore up their election systems.

At the moment, federal funds through the Help America Vote Act pay for almost two-thirds of the state board's IT personnel, according to a statement that board spokesperson Pat Gannon released in opposition to the Senate's proposed budget, which would cut off these resources.

"Without federal funding, the state could not modernize its election management system," Gannon wrote.

A funding loss would put the state in a difficult position, as its current systems are outdated and subject to "predictive faults, memory and functional limitations and inadequate reliability," according to the part of the state board's IT report focused on replacing state servers.

North Carolina is not an outlier in using old technology. Without funding or an external force, state governments are often reluctant to upgrade their election systems, Buell said.

South Carolina only upgraded its voter registration software after the state bought new electronic poll books that didn't work with the old programs, Buell said. He served on the Richland County Board of Elections for two years and worked with the League of Women Voters to advocate for election security.

North Carolina counties will not experience much change when the state board transfers the registration data to the cloud, according to Sara LaVere, elections director for Brunswick County. The login process and interface will change a little, and the state will dispose of the county server, she said.

The bigger changes will be phased in step by step over the next couple of years, according to Neesby. The state is rewriting software for the entire registration management system, designed for use in the cloud.

"The current system was built in 1998, which is an antiquated coding and software platform that has reached end-of-life for software and hardware functionality," the state board's IT report reads. "Modernization is necessary for functional and security reasons."

Editor's note: This article first appeared online at Carolina Public Press, an independent, in-depth and investigative nonprofit news service for North Carolina that allows WRAL News and other media outlets to republish its work.


Cloud Storage vs. Local Storage: Which Storage Solution is the Best for Your Business? – Enterprise Storage Forum

If we had to define the most common question we get asked, it would be: Which storage solution is best for my business? Businesses have been asking us this since cloud storage became feasible and affordable for small businesses. Unfortunately, it doesn't have an easy answer. Or, at the very least, it doesn't have an easy answer at the level of your whole business.

In reality, virtually every business out there makes use of both types of storage, at least to some degree. The formal name for this approach is hybrid storage, though most companies using it wouldn't use the term for the now-mundane mix of public cloud storage, on-device temporary storage, and long-term hard drive backups.

Nevertheless, for those new to the subject or those who need a quick refresher course, it's possible to quickly summarize the differences between these types of storage.

Generally speaking, local storage means data that you store on-premises. This includes every flash, hard, or backup drive you have, no matter how small; the hard drive in your personal laptop (or even your phone) is technically local storage if you take it into work.

Local storage is great in some ways, and not so good in others.


Cloud storage is a type of storage where your data is not stored on your own servers; instead, you access files and programs over an internet connection. In slightly more technical terms, it's when you access any information stored in a data center via the internet.

You can check out our guides to how cloud storage works if you want to know more, but for most small business owners there is a more important question: Is cloud storage worth the cost?


Cloud storage can also be slower than local storage, because you are limited by the speed of your internet connection. While you are only likely to notice this if you are transferring large amounts of data, it can be an issue in growing businesses.

As you can see, both types of storage have pros and cons. Most businesses make use of both simultaneously, mixing local storage of critical, high-security files with cloud servers that allow staff remote access to the data they use every day.

In short, if you are not already using cloud storage for at least a portion of your storage needs, you should know that it will more than likely save your business significant costs.



Key aspects of RPA in the cloud – TechTarget

Robotic process automation started as a more efficient way to write macros on desktop computers. However, many enterprises had difficulties scaling it beyond a few software robots called "bots."

As interest in the automation technology grows, robotic process automation (RPA) vendors are moving these bots from desktop computers to the cloud. RPA in the cloud could simplify the infrastructure, improve scalability and provide better integration with other cloud applications.

Cloud providers are seeing the benefits of RPA and are acting accordingly. Recently, Microsoft revamped its Power Automate tooling for the cloud and Google invested in Automation Anywhere, one of the leading RPA vendors. Additionally, all the major RPA vendors have been busy refactoring their offerings to run more efficiently in the cloud.

Learn how RPA works in the cloud, its deployment model, automation workflows and the importance of governance.

In many ways, RPA in the cloud works similarly to how it works on premises. IT teams create bots that can learn and then execute rules-based business processes. In addition, many RPA offerings can observe human digital actions and then design a bot to complete these actions, automating the bot creation process itself.

In the cloud, RPA can take advantage of cloud-native architectures, security models and scalability more efficiently than it can on desktop or on-premises servers. Designing the technology, design approval, security reviews and the actual establishment of RPA platforms can be a major cause of delays when kicking off a digital workforce program in a larger organization -- but the cloud option reduces those delays significantly, said Maurice Dubey, executive director at Q4 Associates and author of Adopting a Digital Workforce.

With architectures built on containerized microservices and serverless infrastructure, cloud RPA provides better scalability, said Amardeep Modi, practice director at Everest Group, an IT advisory firm.

Containers reduce the time required for configuration and setup, as well as simplify auto-scaling without manual intervention. This lowers resource utilization, since IT teams can scale the microservices underlying RPA capabilities up and down independently, rather than taking the traditional approach of replicating whole servers. These factors can lead to a lower total cost of ownership.

One important aspect of cloud RPA is the deployment model. RPA infrastructure can be dynamically scaled up in the appropriate cloud platform to be closer to other applications. This reduces the burden on IT staff who traditionally had to manage physical servers. All the major RPA vendors have created cloud-specific automations that improve the provisioning and management of bots in the cloud.

For example, the Blue Prism RPA platform currently supports deployments on AWS, Google Cloud Platform, Microsoft Azure, IBM, Oracle Cloud Infrastructure and Salesforce AppExchange. It also enables IT teams to integrate bots into cloud-native services, such as cognitive services and productivity applications. Another aspect of cloud support lies in tapping into an ecosystem of integration partners that specialize in different cloud platforms or business domains.

Cloud RPA changes the way RPA bots are deployed and integrated into other applications. Traditionally, RPA bots have connected to other applications by clicking and scrolling in the application's user interface. The cloud makes it easier to create hybrid bots that are programmed like traditional bots but can also automate tasks via APIs.

The rise of hybrid bots running in the cloud could combine the ease of development and understanding of RPA with the traditional performance and scalability of low-code/no-code development.

For example, Microsoft's Power Automate calls these hybrid automations "cloud flows." These allow bots to control apps via direct API integration rather than trying to mimic keystrokes and mouse clicks, which are less scalable and prone to break when the UI changes.
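As a hedged sketch of the difference (the endpoint, fields and token are hypothetical, not Power Automate's actual mechanics), an API-driven bot step collapses into an ordinary HTTP call:

```python
# Where a classic bot would open the invoicing app and replay clicks and
# keystrokes, a hybrid bot posts the same data straight to the app's API.
import requests

def submit_invoice(api_base: str, token: str, invoice: dict) -> str:
    """Create an invoice via the API instead of driving the UI."""
    resp = requests.post(
        f"{api_base}/invoices",                      # hypothetical endpoint
        json=invoice,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()   # fail loudly rather than silently mis-click
    return resp.json()["id"]  # a stable contract, unaffected by UI redesigns

# Example (hypothetical values):
# submit_invoice("https://erp.example.com/api", token, {"customer": "ACME", "total": 120.5})
```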

One of the biggest attractions of RPA, compared to low-code development tools, is that application development mirrors the way users traditionally interact with applications. This makes it easy for business users to understand what is going on and to use process mining techniques to create the starting point for new automations. APIs and low-code development tools typically require expert knowledge to use.


Cloud RPA also allows enterprises to change the way they govern automations, even for ones that automate the desktop UI of an application running on a local PC. This centralizes administration and governance so that administrators get greater visibility into everything that is created and run by users in the organization.

Also, IT leaders need to ensure governance is implemented appropriately.

"By giving RPA to citizen developers, we may end up with the Microsoft Access scenario where organizations' IT departments ended up having to support a lot of poorly designed and developed Access solutions, some of which ended up managing mission-critical aspects of their business," Dubey said. He believes that baking some of the governance capabilities into cloud RPA could help reduce this risk.

Deloitte's Kruger believes that cloud RPA could also help organizations manage risk and operational support better than low-code/no-code automation. He predicted that demand for cloud RPA will grow faster as more organizations move to cloud-native applications in general. This acceleration will be fueled by independent and professional developers creating and sharing automations through the cloud.
