Category Archives: Cloud Servers

Esri Announces New Security Enhancements through Integration of US Government-Approved Drone and Cloud Deployment to the European Union – sUAS News

Global Users Facing Restrictions on Drone Hardware or Drone Data Now Have a Complete, End-to-End Solution for Planning, Executing, and Processing Data

REDLANDS, CA, October 6, 2020: Esri, the global leader in location intelligence, today announced two major capabilities in Site Scan for ArcGIS that will enable governments and critical infrastructure organizations to meet hardware and software regulations in the US and Europe. Through an established partnership with Auterion, creator of the most widely used open-source drone autopilot operating system, security-conscious US organizations will be able to use Site Scan, Esri's unmanned aerial systems flight planning and processing solution, to plan and execute missions with the trusted and secure Freefly Astro drone, powered by Auterion. Additionally, for organizations in Europe with data sovereignty requirements, a new and fully independent instance of Site Scan for ArcGIS has been deployed to European servers, ensuring that organizational data resides within the region.

Site Scan for ArcGIS is used by organizations that require drone imagery for visual inspections, site monitoring, asset management, and situational awareness. It's an all-in-one, cloud-based drone mapping solution for managing fleets and for collecting, processing, analyzing, and sharing data products. Industries using this solution include architecture, engineering, construction, natural resources, infrastructure, and government. Core capabilities of Site Scan, such as scalability, collaboration, time savings, and now enhanced security, provide value to customers.

The US government has recently issued a growing number of advisory warnings and bans on the use of drones that pose security risks. These precautions have adversely impacted federal agencies and private firms that manage critical infrastructure, causing them to adopt incongruous drone data capturing and processing workflows that consist of multiple vendor solutions. Esri can now offer these agencies a single, end-to-end drone solution that integrates Freefly Astro, using US Department of Defense-approved Blue sUAS software architecture from Auterion, and is fully supported by Site Scan.

"Our expertise in providing an enterprise drone platform based on open-source software enabled us to meet the needs of the US government and governments worldwide," said Dave Sharpin, CEO, Auterion Government Solutions. "We are very excited to partner with Esri and provide [its] users with our groundbreaking technology."

By law within Europe, data from publicly funded or critical infrastructure projects cannot leave the European Union (EU). To enable a scalable drone workflow, Site Scan for ArcGIS has been deployed to a server cluster in Ireland. European customers that require their data not be transmitted outside the region can now leverage this server cluster to meet project requirements.

"The relationship we have established with Auterion is key in being able to offer high-quality, secure drone software to our US customers looking to take advantage of our advanced, secure, drone-based imagery collection and management platform," said Richard Cooke, Esri director of imagery and remote sensing. "Additionally, through the development of the EU deployment, an even wider range of customers located in Europe will be able to maintain their data and data processing locally."

The Freefly Astro and Site Scan integration will be available for customers by December 2020. The European deployment of Site Scan is available today. To learn more about the new integration of Site Scan with the Freefly Astro drone or about the EU deployment of Site Scan, contact the Esri sales team at esri.com/en-us/arcgis/products/site-scan-for-arcgis/overview#contactsales.


Giving Was Strong the First Half of the Year. Will That Continue? – The Chronicle of Philanthropy

Fundraisers got some well-deserved good news this week. Charitable giving in the first half of 2020 increased by almost 7.5 percent over the first half of 2019.

That marks a big shift from the first quarter of this year, when giving was 6 percent behind the same period in 2019, Eden reports. The second quarter also marked a five-year high in the number of donors and contributions.

The results are from the Fundraising Effectiveness Project, which is managed by the Association of Fundraising Professionals in collaboration with GivingTuesday and analyzes donation data from the Growth in Giving Database.

While donors at all levels have stepped up in a big way during the pandemic, those who gave less than $250 were a major driver of growth. The number of small donations increased 19.2 percent over the first six months of last year. That may be due in part to the $300 universal charitable deduction that was enacted as part of the Cares Act.

It wasn't just small-dollar gifts fueling the growth. The number of midlevel donors, who made gifts of $250 to $999, and major donors, who made gifts of $1,000 or more, increased year over year by 8.1 percent and 6.4 percent, respectively.

But this is 2020. The project's leaders issued a warning along with the positive data.

"Fundraisers should be cautious about getting too excited about the uptick in giving in the second quarter," said Lori Hunter Overmyer, chair of the AFP Research Council. Giving almost always decreases in the first quarter, and the continued need for nonprofits' services, along with the sluggish economy, could potentially depress giving over the long term, she said.


Ronald McDonald House Charities of Chicagoland and Northwest Indiana has notched some fundraising wins since the start of the pandemic, my colleague Emily Haynes reports.

In May, the charity hastily moved its annual gala online. That event typically raises more than $1 million, but it brought in just $800,000 this year. On the other hand, event expenses went down, too. The charity spent just $35,000 to put on the gala, which it redesigned as a monthlong online campaign, including live performances by local bands and a magic show by Ronald McDonald, culminating in a two-hour event streamed on Facebook Live. By comparison, the annual one-night gala usually costs about $300,000 to produce.

What's more, 132 donors made their first gift to the charity as part of the refigured gala. "If we had met in person, our room would've been 700 people," says Holly Buckendahl, the organization's CEO. Moving the event online widened the charity's reach, she says. "Our audience became endless."

Before Covid-19 hit, the charity had planned to test a $20,000 summer fundraising campaign. It has far exceeded its goal, raising slightly more than $353,000. And while it's too soon to tell how many of those gifts came from donors who made their first contributions in May, those new donors did receive email appeals during the summer campaign.

Still, Buckendahl expects this year's fundraising revenue will be 25 percent below budget. The charity anticipates dipping into its reserves to make up for the shortfall.

As the critical year-end giving season approaches, Buckendahl says fundraisers aren't taking their foot off the gas pedal.

Although some of her colleagues at other nonprofits worry that donors will be tired of fundraising appeals by December if charities start asking for donations now, Buckendahl isn't concerned about that. Her charity is communicating frequently with supporters about how the pandemic is affecting its mission and what its financial needs are.

"Your big donors, your year-end donors, donors that give to you all year long: they need to know now, and they'll make their choices when they make their choices," she says. "We are just making sure our audience and our donor family understand where we're at and how we're doing."

Learn more about giving during the first half of 2020, and read the full story about fundraising during the pandemic at the Chicago-area Ronald McDonald House.

How did fundraising fare at your organization the first half of the year? Are you confident or nervous as we head into the year-end giving season? Drop me a line, and we might include your comments in a future newsletter.


Three Advantages Of Using Cloud Computing In Business – CIO Applications


FREMONT, CA: Cloud computing is an alternative to traditional data center server storage that relies on internet-based hosting for data and applications. Sometimes loosely described as serverless computing, it stores files and applications in a virtual cloud, enabling access from any device, anywhere.

Here are three advantages of using cloud computing:

Security

Keeping stored data protected is crucial, and many cloud computing providers make security a priority by providing significant data protection. Public and private cloud providers cannot afford to lose customers through inferior security, so they offer everything from multi-factor authentication and security certificates to patch management. Private cloud providers offer a hands-on approach, making sure practices comply with the company's goals and objectives.

Cost Savings

Cost saving is fundamental for most companies, and cloud computing storage offers several benefits here. Many cloud computing services are hosted by third-party service providers, and both public and private cloud solutions typically charge on a per-user basis. Expenses depend on how many people need access to the cloud, unlike a traditional server setup, in which the price is flat whether there is one user or 1,000.

It also reduces the number of IT FTEs needed to manage servers. Maintenance issues are handled by the cloud service provider rather than by dedicated in-house staff, saving both cost and time.

Most cloud providers bundle other services into the per-user cost, such as internet connectivity, help desk services, security, and support, which helps businesses reduce their on-site storage resources and cut costs without giving up what a private cloud offers.
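The pricing contrast described above can be sketched with a toy calculation. All dollar figures below are hypothetical illustrations, not vendor quotes:

```python
# Toy comparison of per-user cloud pricing vs. a flat on-premises cost.
# All figures are hypothetical illustrations, not vendor quotes.

def annual_cloud_cost(users: int, per_user_monthly: float) -> float:
    """Per-user subscription: spend scales with headcount."""
    return users * per_user_monthly * 12

FLAT_ONPREM_YEARLY = 20_000  # flat cost whether 1 or 1,000 people use the server

for users in (10, 100, 1000):
    cloud = annual_cloud_cost(users, per_user_monthly=25)
    cheaper = "cloud" if cloud < FLAT_ONPREM_YEARLY else "on-prem"
    print(f"{users:>4} users: cloud ${cloud:,.0f}/yr vs on-prem ${FLAT_ONPREM_YEARLY:,}/yr -> {cheaper}")
```

The point the article makes falls out directly: at ten users the per-user model is a fraction of the flat cost, while at a thousand users the economics reverse, which is why expense planning under the cloud model starts from headcount rather than hardware.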

Mobility

Cloud computing resources allow files, programs, applications, and data to be accessed from anywhere, on any device with an internet connection. It can also smooth over system incompatibilities.

This mobility of access allows employees to work from anywhere, improving productivity and flexibility. It also improves customer satisfaction, as customers and clients gain reliable access to information and service, increasing the chances of loyalty to the brand.


How secure is the cloud in 2020? – Techerati

Despite increasing levels of adoption by organisations of all sizes, cloud solutions continue to be plagued by misconceptions about their security. It's still commonly assumed that the cloud offers a less secure option compared to on-premises infrastructure. So, how does it really shape up, and what security challenges face the cloud in 2020?

While businesses that keep their data on-site often feel as though they have more control over its security, the flaw in that plan is usually a lack of in-house expertise. The cyber-skills gap is widely documented, leaving almost half of UK businesses unable to deal with even basic security tasks. Unless you can afford a dedicated, specialised on-site security team, chances are your data would be as safe, if not safer, stored by a public cloud provider with access to the best resources and expertise.

The UK 2020 Databerg Report shows that the perception of cloud security is slowly changing. In 2015, 77% of businesses expressed concerns about cloud security, and this has seen some improvement over the last five years, although 59% remain unconvinced. According to the report, the likely reason this mistrust persists is unconnected to the physical capabilities of the cloud. Instead it lies with the fact that data stored in the cloud always remains the responsibility of the organisation, rather than the cloud provider. If a data breach occurs, the financial and reputational repercussions fall directly to the organisation.

It's therefore important that businesses seek out a vendor they can trust, with the knowledge and expertise to best secure their sensitive data. Equally important is that businesses educate their employees on best-practice protocols and procedures; human error remains one of the biggest causes of data breaches. With the rise of remote working and BYOD, more and more data is accessed via the cloud, making it harder for organisations to keep an eye on data security. In 2020, 31% of employees took business information outside of the organisation via cloud storage, up from 21% in 2015.

While the cloud does not inherently provide additional risk, it remains a target for cybercriminals. Back at the beginning of 2020, as hackers scrabbled to take advantage of the pandemic, cloud-based attacks rose by 630% between January and April. In Verizon's 2020 Data Breach Investigations Report, however, we can see that cloud security still compares favourably to on-premises alternatives. This year, cloud assets accounted for 24% of breaches, compared with 70% for on-premises assets.

Of those cloud breaches recorded, 77% involved compromised credentials. Rather than being a demonstration of inherent weakness in the cloud's security, this serves to illustrate the huge growth in social engineering attacks, such as phishing scams, that aim to steal privileged access credentials. The quickest and easiest way for cybercriminals to access systems (cloud-based or otherwise), credential theft is fast becoming one of the worst offenders among causes of data breaches. According to the latest Ponemon Institute Cost of a Data Breach Report, a fifth of all data breaches are now the result of stolen or compromised credentials. Worryingly, this was found to impact the average cost of a breach by almost $1 million.

The Ponemon report also states that misconfigured cloud servers tie with compromised credentials as the most frequent threat vector. This is confirmed by Verizon's findings, which show that misconfiguration errors have increased since 2017, to the point where they are now more common than malware and outranked only by hacking.

With these statistics in mind, it's easy to understand why those 59% of organisations remain wary of the cloud. The result of human error during setup, cloud misconfiguration can leave data exposed or present vulnerabilities that could later be exploited by threat actors. It's important to make sure your cloud is configured by experts, and regularly audited, updated and patched. Responsibility for configuring the cloud is shared between an organisation and its service provider, so it's important to make sure you're working with the right partner.

UKFast offers a truly unique, tailored approach to cloud hosting, with a range of public and private cloud servers and a team of security specialists on hand to keep your data safe. Speak to a cloud expert today on 0800 073 0317.


Cloud computing is betting on outer space – Mint

The Redmond-headquartered company, however, has competition in the skies. Almost five months earlier, International Business Machines Corp. (IBM) had announced a beta of its Cloud Satellite service. But it is Amazon Web Services Inc. (AWS), the cloud computing arm of Amazon.com, which has a head start in space.

Around two years ago, it launched AWS Ground Station to allow its customers to control their satellite communications, process data, and scale operations without having to build or manage their own ground station infrastructure. On 30 June, AWS said it was establishing a new space unit called Aerospace and Satellite Solutions.

These are but a few cases in point to demonstrate that leading cloud computing service providers have begun flexing their muscles in space too. But why is there a sudden race to outer space?

According to the International Telecommunication Union (ITU), non-geostationary satellite orbits (NGSOs) such as medium earth orbits (MEO) and low earth orbits (LEO) are being increasingly used worldwide. NGSOs, unlike fixed or geostationary satellite orbits, move across the sky during their orbit around the earth. With space launches becoming more affordable and accessible, a slew of private companies are starting to rely on this new array of satellites.

They are used for applications like weather forecasting, surface imaging, communications, and video broadcasts. However, the data from these satellites needs to be processed and analysed in data centres on the ground, which explains the term "ground stations".

While the cost of the satellite itself is falling, building and running ground stations can cost up to $1 million or more, according to a recent blog post by Jeff Barr, chief evangelist for AWS. Complex data processing also requires a lot of computing power, and the huge data storage requirements only add to the cost.

Leading cloud computing service providers are now starting to offer satellite operators the option to use these ground stations on a pay-per-use or subscription basis, thus, helping the latter save on capital expenditure costs by employing an operating expenditure model.


These ground stations, thus, can help satellite operators download high-resolution imagery faster and more regularly, and analyse the data with artificial intelligence (AI) tools, all of which results in faster and enhanced monitoring of changing climate patterns, forests and agriculture, among other things.

While Microsoft and IBM are testing their services, AWS Ground Station already has customers such as NASA's Jet Propulsion Laboratory and satellite operators Iridium Communications and Spire Global. It also has private sector customers such as Lockheed Martin, Maxar Technologies and Capella Space.

Lucrative market

The worldwide cloud infrastructure services market continued to surge in the April-June quarter of this calendar year to touch $34.6 billion, according to research firm Canalys. The growth was attributed to the consumption of cloud-based services for online collaboration and remote working tools, e-commerce, remote learning, and content streaming, which hit new records during the lockdown.

During this period, AWS was the leading cloud service provider, accounting for 31% share of the total spend. Microsoft Azure came second, followed by Google Cloud and Alibaba Cloud.

The revenue of the cloud unit of Amazon totalled $10.81 billion in the April-June quarter of this calendar year, accounting for 12% of its parent's revenue.

Microsoft, on the other hand, said its commercial cloud "surpassed $50 billion in annual revenue for the first time" for the quarter ended June 30 (which is also its financial year ending). But it does not spell out what this commercial cloud consists of.

Nevertheless, the space forays will only add to the revenue of all these companies.

Battle lines in India

Space deals will add spice in India too. India's cloud computing market was estimated at $2.5 billion in 2018, dominated by infrastructure as a service (IaaS) and software as a service (SaaS), according to industry body Nasscom. It is forecast to touch over $7 billion in 2022.

AWS, Microsoft and Google are leaders on the local turf too. Last August, for instance, Microsoft signed a deal with Reliance Jio Infocomm Limited (Jio), a subsidiary of Mukesh Ambani-owned Reliance Industries Ltd (RIL). The agreement included deploying the Microsoft Azure cloud platform in Jio's data centers in locations across India.

This January, Google Cloud signed a deal with Bharti Airtel to cater to small and medium enterprises (SMEs) in India. However, Google said this July that it was pumping $4.5 billion into Airtel's rival Jio Platforms in exchange for a 7.7% stake. Not surprisingly, a month later, Bharti Airtel announced a multi-year agreement with AWS to deliver cloud solutions to big companies and SMEs in India.

According to Alok Shende, Managing Director of Ascentius Insights, the fusion of cloud computing with networking, linked by a satellite, is expected to shave off milliseconds in transferring data from source to destination. "This is the holy grail in many applications, more specifically in finance and in mission-critical applications. There are many India-centric applications (like defence and in the stock markets) where this could play a powerful role."

He believes that for Microsoft, particularly, this move "opens a new avenue to entrench itself in the enterprise market where it has traditionally been a strong player on the application side but has lost the leadership position in terms of market share for cloud."

Jayanth Kolla, founder and partner of Convergence Catalyst, points out that India has always been a strong player in the space sector, with the Indian Space Research Organization (Isro) developing and launching satellites at a fraction of global costs. He believes that the Indian government's decision to open up India's space sector to private players is an encouraging sign.

"It has already resulted in Indian space tech startups such as Pixxel, Bellatrix Aerospace, Vesta Space and Agnikul raising over $20 million funding from venture capitalists (VCs) in the last six months. TV media, agriculture, telemedicine and logistics are a few sectors that can benefit from strong satellite communication and space technology development. The ground station services launch by Microsoft and AWS will only expedite this ecosystem development significantly in India," says Kolla.

Sanchit Vir Gogia, chief analyst and founder of Greyhound Research, concurs that the timing of this space move is right since many organizations are now beginning to try new use-cases by tapping into geospatial data (data related to a specific location on earth) that is omnipresent, given the proliferation of devices and edge computing devices.

"This space is increasingly getting busy with the likes of AWS and IBM investing money and resources to cater to this opportunity," notes Gogia. He cautions, however: "We believe the trick in making such an offering successful is to ensure that it is cheap to start with, since most of these projects are nothing more than trials and, hence, have an extremely high failure rate."

The distributed cloud

Space is just an additional frontier for the leading cloud services providers. It all began when companies, which traditionally used servers for their computing needs, realised that they could lower costs by accessing IT resources over the internet and paying only for the services they needed, reducing capex, a trend we now know as cloud computing.

Many companies today use private clouds (on-premise), public clouds (on a network, typically the internet) and hybrid clouds (combining public and private). User companies, though, became wise and began adopting a multi-cloud vendor approach to avoid being locked in by any single technology or cloud vendor.

With billions of devices getting connected to each other as part of the Internet of Things (IoT) trend, computing is now also getting done at the so-called "edge", which simply means near the source of the data.

General Electric Co. (GE), for instance, believes cloud computing is best suited to situations that demand actions such as significant computing power, management of huge data volumes from across plants, asset health monitoring and machine learning. Edge computing, on the other hand, makes sense in places like mines or offshore oil platforms that have bandwidth constraints, which make it impractical or very expensive to transmit data from machines to the cloud.

During his speech at the Ignite event, for instance, Nadella pointed out that Microsoft was extending Azure "from under the sea to outer space". He was referring to Project Natick, which aims to serve customers in areas near large bodies of water. Natick uses AI to monitor signs of failure in its servers and other equipment.

Going forward, Microsoft says it will explore powering a Natick data center by "a co-located ocean-based green power system, such as offshore wind or tide, with no grid connection".

Similarly, besides deploying high-altitude internet balloons to provide broadband services, Google also provides services to companies like Planet Labs Inc. The US-based aerospace and data analytics company uses the Google Cloud Platform to process all of its satellite images and Google Cloud Storage to host its image archive.

These moves have given rise to a trend called the distributed cloud, which research firm Gartner describes as "distribution of public cloud services to different physical locations".

"By 2023," posits a 22 January note by Gartner, "the leading cloud service providers will have a distributed ATM-like presence to serve a subset of their services for low-latency application requirements... Micro data centers will be located in areas where a high population of users congregates, while pop-up cloud service points will support temporary requirements like sporting events and concerts."

Greyhound Research believes offerings such as ground stations will be highly valuable in the next wave of investments in more distributed computing environments. "More than 7 in 10 of our end-user inquiries with global majors have confirmed that organizations, in the next 3-5 years, will use a large variety of computing environments and make them more contextual to the use-case," says Gogia. "This change is likely to be paced multiple times, given the investments in edge networks and 5G that allow remote sites in utilities, oil and gas, manufacturing, and many other scenarios," he adds.

The distributed cloud market is forecast to reach $3.9 billion by 2025, growing at a CAGR of 24.1% over the 2020-2025 forecast period, according to market research firm IndustryARC. Security, though, remains a concern if proper protocols and policies are not adhered to in a distributed cloud.
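Those two figures imply a starting market size that the article does not state. As an illustrative back-of-the-envelope check (not a figure from IndustryARC), the 2020 base can be recovered by inverting the compound-growth formula:

```python
# Back out the implied 2020 market size from the cited projection:
# $3.9B in 2025 after five years of 24.1% compound annual growth.
def implied_base(final_value: float, cagr: float, years: int) -> float:
    """Invert the compound-growth formula: base = final / (1 + r)^n."""
    return final_value / (1 + cagr) ** years

base_2020 = implied_base(3.9, 0.241, 5)
print(f"Implied 2020 market size: ${base_2020:.2f}B")  # roughly $1.32B
```

In other words, a 24.1% CAGR landing at $3.9 billion in 2025 implies a market of roughly $1.3 billion in 2020, a near-tripling over the forecast period.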

For now, though, ground stations that cater to satellite companies will remain one big component of the distributed cloud. A race is clearly on, and all the main players are looking up at the sky.

Leslie D'Monte is a consultant who writes on the intersection of science and technology.



VMware wants to play nice with Nvidia DPUs – Blocks and Files

VMware and Nvidia announced yesterday that they are working to make VMware software work better with Nvidia chips. The joint initiative, dubbed Project Monterey, will introduce a new security model that offloads hypervisor, networking, security and storage tasks from the host CPU to Nvidia's BlueField data processing unit (DPU). This should be useful for AI, machine learning, and high-throughput, data-centric applications, according to the companies.

Nvidia CEO Jensen Huang said in the launch announcement: "Nvidia DPUs will give companies the ability to build secure, programmable, software-defined data centres that can accelerate all enterprise applications at exceptional value."

Paul Perez, SVP and CTO, Infrastructure Solutions Group at Dell Technologies, also provided a statement: "We believe the enterprise of the future will comprise a disaggregated and composable environment."

Dell said VMware Cloud Foundation will be able to maintain compute virtualization on the server CPU while offloading networking and storage I/O functions to the SmartNIC CPU. VMware has taken the first step to achieve this by enabling VMware ESXi to run on SmartNICs.

A SmartNIC or DPU is a programmable co-processor that offloads non-application tasks from a server CPU, enabling the server to run more applications, faster. DPUs can compose disaggregated data centre server compute, networking and storage resources. They can also function as intelligent network interface cards that provide security services and network acceleration.

Nvidias BlueField-2 is a Mellanox system-on-chip (SoC) that integrates a ConnectX-6 Dx ASIC network adapter with a PCIe Gen 4 x16 lane switch, 2 x 25/50/100 GbitE or 1 x 200GbitE ports, and an array of 8-core, 64-bit Arm processors. This provides an integrated crypto engine for IPsec and TLS cryptography, integrated RDMA and NVMe-oF acceleration, and dedupe and compression.

Three use cases are envisaged. First, BlueField-2 can be used with disaggregated storage, which it virtualizes and enables remote, networked storage to be part of a composable infrastructure. Second, BlueField-2 can provision bare metal servers as a CSP operator service to cloud tenants.

VMware said it will re-architect VMware Cloud Foundation to enable disaggregation of the server including support for bare metal servers, a new Cloud Foundation facility. It will enable an application running on one physical server to consume hardware accelerator resources such as FPGAs from other physical servers.

With ESXi running on the SmartNIC, customers will be able to use a single management framework to manage all their virtualized and bare metal compute infrastructure.

Thirdly, BlueField-2 can be used for micro-segmentation at endpoints to isolate application workloads and their resources from each other.

There is a security aspect to Project Monterey. Each SmartNIC is capable of running a fully-featured stateful firewall and advanced security suite. Up to thousands of tiny firewalls will be able to be deployed and automatically tuned to protect specific application services that make up the application.

Project Monterey is available as preview code.

VMware is collaborating with Intel, Nvidia and Pensando, and system vendors Dell, HPE and Lenovo to deliver Project Monterey systems. Dell said it could deliver automated systems using SmartNICS from a broad set of vendors.

DPU suppliers include three startups: Fungible, Nebulon, and Pensando. Pensando recently announced it will provide its DPU as a factory-supported option on HPE servers across the VMware Cloud Foundation product line, including vSphere, vSAN, and NSX. Customers will be able to access Pensando's platform directly within VMware hardware.

Separately, VMware announced at VMworld 2020 yesterday that it is jointly building a deployment platform for VMware-controlled servers to run AI software on attached Nvidia A100 GPUs. The platform combines VMware's vSphere, Cloud Foundation and Tanzu container orchestration software with Nvidia's NGC software.

NGC (Nvidia GPU Cloud) is a website catalogue of GPU-optimised software for deep learning, machine learning, and high performance computing. NGC software is supported on a select set of pre-tested Nvidia A100-powered servers expected from leading system manufacturers.


Industry Groups Spar Over NDAA Provisions on Sourcing of Electronics from China – Nextgov

Manufacturers and assemblers of printed circuit boards are standing apart from other major industry groups in praising sections of the National Defense Authorization Act that would require defense contractors to use less and less of such equipment from adversarial nations over time.

"For years, domestic industry has diminished in size and power while other countries, including China, have invested heavily in bolstering their own industrial capabilities," reads a Sept. 29 letter IPC, a trade association of the manufacturers, sent to the chair and ranking members of the House and Senate Armed Services committees. "As a result, DoD today relies on nonallied producers for [printed circuit boards and printed circuit board assemblies] in areas including cloud servers, IT, and telecom networks. This continued reliance on untrusted foreign suppliers for [printed circuit boards and printed circuit board assemblies] poses numerous risks to national security."

Both the House- and Senate-passed versions of the NDAA would require Defense contractors to use increasing amounts of such equipment from U.S. manufacturers or those of allied countries. Under the House bill, 100% of printed circuit boards and printed circuit board assemblies would come from those covered countries by 2033. The Senate bill calls for full sourcing from covered countries by 2032 and 25% of the equipment coming from trusted countries by 2023.

The Senate bill also explicitly bars procurement of the equipment from China, Russia, Iran and North Korea, all of which have been designated as posing world-wide threats by the intelligence community.

"We urge you to speedily resolve any remaining issues between the House and Senate while keeping in mind the strong protections that passed both chambers with overwhelming support and without any vocal opposition," IPC wrote.

A conference committee, which will reconcile the House and Senate bills before a final vote on the legislation, has not yet been formed, staff from the office of Rep. Jim Langevin, D-R.I., chairman of an Armed Services subcommittee on emerging threats, told Nextgov.

The NDAA provisions (Sec. 808 in the Senate bill and Sec. 826 in the House bill) did spur opposition from a broader group of companies in advance of those negotiations.

"ARWG remains concerned with the broad applicability and programmatic impact of the House and Senate provisions related to printed circuit board (PCB) procurement," reads a Sept. 24 letter the Acquisition Reform Working Group also sent to the leaders of the Armed Services Committees in both chambers.

ARWG includes the Associated General Contractors of America, the Information Technology Industry Council, the Computing Technology Industry Association, the National Defense Industrial Association, the American Council of Engineering Companies and the United States Chamber of Commerce.

"ARWG recommends the conferees direct the Secretary of Defense to implement a design verification standard to ensure that [printed circuit boards] present no national security risk regarding counterfeiting, quality, or unauthorized access," the associations collectively wrote. "Subsequent to this submission, ARWG will provide specific recommendations on these matters separately."

The House and Senate bills both contain a number of ways companies might receive waivers from the provisions from the Secretary of Defense, including if the secretary determines the covered equipment "poses no significant national security concerns regarding counterfeiting, quality, or unauthorized access."

Chris Mitchell, IPC's vice president for global government affairs, told Nextgov the request for a design verification standard and the provisions in the NDAA bills are not mutually exclusive, noting "we should and have been working to develop standards along the lines of what's laid out in the [ARWG] letter."

He added that IPC separately has a trusted supplier standard, IPC 1791, that the group believes is a model for DOD. And he stressed that while the design verification standard has merit, it doesn't fully address either the security issue or the resilience of the industrial base in general, with 55% of printed circuit board production happening in China.

A Sept. 16 blog post from IPC President John Mitchell asserts, "The opposition fears the new requirements will disrupt their established supply chains in countries that are not affirmatively covered."

"The status quo may be advantageous to some, but this is not a compelling enough reason to nullify a major step forward for American manufacturing capabilities," the IPC letter reads. "[The NDAA provisions] would create new high-skilled workforce opportunities for U.S. workers and provide trusted supplies to the U.S. government to use in critical applications."

More here:
Industry Groups Spar Over NDAA Provisions on Sourcing of Electronics from China - Nextgov

Hewlett Packard Enterprise Becomes the Only Major Server Manufacturer to Ship World’s Most Secure Industry-Standard, Made-in-USA Servers – Business…

SAN JOSE, Calif.--(BUSINESS WIRE)--Hewlett Packard Enterprise (HPE) today announced it is delivering the highest level of security for a growing number of U.S. federal and public sector customers that prefer U.S.-sourced products with verifiable cyber assurance by expanding and further securing its supply chain. HPE is the only major server manufacturer to produce made-in-USA industry-standard servers. The new servers include advanced security features and are built by vetted HPE employees in highly secure U.S. facilities as part of the HPE Trusted Supply Chain initiative launched today.

The new HPE ProLiant DL380T server is shipping today to U.S. customers as the first industry-standard server to be produced through the HPE Trusted Supply Chain process, which HPE plans to expand to the rest of its portfolio in 2021.

HPE Expands Industry-leading Security Capabilities from the Edge-to-Cloud

HPE is the world's leading supplier of trusted computing, having been the first vendor to embed silicon-based security into its industry-standard servers. Its exclusive silicon root of trust technology runs in over two million servers globally today and forms the foundation of HPE's vision for securing the enterprise and delivering edge-to-cloud, platform-as-a-service experiences.

HPE also delivers the industry's most advanced embedded network security with Aruba's high-performing, highly reliable and secure wired and wireless network infrastructure solutions. Additionally, HPE's recent acquisition of Scytale further extends zero trust capabilities by standardizing and accelerating service authentication across cloud, container, and on-premises infrastructures.

"Customers turn to HPE to accelerate innovation and power any application need while gaining data protection throughout the lifecycle," said John Grosso, vice president of Global Operations Engineering, Supply Chain, at HPE. "With the new HPE Trusted Supply Chain, we are furthering our commitment to deliver the highest level of security in all of our server products by diversifying our supply base with U.S. sourcing and applying stronger security at the factory floor. We are ensuring that our customers will have full visibility and cyber assurance of their servers to focus resources on deploying their new solutions and optimizing their IT experience."

Responding to Customer Demand for U.S. Sourced and Secure Servers

The HPE Trusted Supply Chain is another step in HPE's ongoing mission to provide customers with the highest level of cyber assurance, strengthening security from the moment servers are manufactured so that customers receive verifiably authentic, uncompromised products and solutions.

The HPE Trusted Supply Chain supports U.S. customers across federal, public sector, banking and financial services, and healthcare organizations that require highly secure products sourced in the U.S. It also addresses customer demands for an additional supply base to increase resiliency and identify and reduce risk in the midst of COVID-19 that has impacted supply chains globally.

New HPE servers that are part of the HPE Trusted Supply Chain will offer the most comprehensive, end-to-end data protection by featuring a pre-installed layer of hardened security before the server is shipped to the customer.

Extending End-to-End Security for Entire Product Lifecycle

HPE further extends its security capabilities in a server from distribution and shipping through its complete active lifecycle. The new features are built on top of the HPE-exclusive silicon root of trust security technology, which insurers in the new Cyber Catalyst program from Marsh have recognized for its ability to reduce risk. Hardened security features are activated during the manufacturing process.

Securing the Human Factor

HPE will also assign employees with verified background and security checks to build products through the HPE Trusted Supply Chain, which adheres to the strictest sourcing, inspection, and traceability standards.

Availability

In 2021, HPE plans to expand production through the HPE Trusted Supply Chain to include its other servers and systems. HPE will make additional made-in-Europe choices available for European customers in 2021.

All new HPE servers produced through the HPE Trusted Supply Chain will be offered as-a-Service through HPE GreenLake for a highly secure cloud experience.

HPE GreenLake offers customers subscription-based, agile, and elastic capabilities while keeping their data on-premises for security, data sovereignty, compliance, visibility, and cost control.

Additional Resources

About Hewlett Packard Enterprise

Hewlett Packard Enterprise is the global edge-to-cloud platform-as-a-service company that helps organizations accelerate outcomes by unlocking value from all of their data, everywhere. Built on decades of reimagining the future and innovating to advance the way we live and work, HPE delivers unique, open and intelligent technology solutions, with a consistent experience across all clouds and edges, to help customers develop new business models, engage in new ways, and increase operational performance. For more information, visit: http://www.hpe.com.

Read the original:
Hewlett Packard Enterprise Becomes the Only Major Server Manufacturer to Ship World's Most Secure Industry-Standard, Made-in-USA Servers - Business...

Privacy, civil rights groups demand transparency from Amazon on election data breaches | TheHill – The Hill

A group of more than a dozen privacy and civil rights organizations on Thursday demanded that Amazon disclose information about breaches of election data in order to increase the company's public transparency ahead of November.

Groups including Color of Change and Demand Progress expressed concerns about Amazon's security, citing past incidents reported by Reuters in which voter data stored on Amazon cloud servers was left exposed online. They noted that one or more of Amazon's election services will be used in 40 states this year.

"Amazon's election services - including running election websites, storing voter registration information and ballot data, and helping to provide live results on election night - concentrate private voter data and history in a single centralized system," the groups wrote in a letter to Amazon CEO Jeff Bezos. "A single breach could have catastrophic consequences for election integrity in dozens of states."

While the groups acknowledged that the election officials involved would have some responsibility for election data security, they warned that this did not mean Amazon should abdicate all involvement.

"If a car seat manufacturer didn't provide proper instructions to make sure people installed their car seats correctly, and it put infants in harm's way, people would blame the manufacturer for its negligence," the groups wrote. "The car seat manufacturer would be expected to do everything necessary to prevent infants from dying."

Other groups that signed on to the letter were the AI Now Institute, Constitutional Alliance, Fight for the Future, Just Futures Law, Kairos Action, Media Alliance, MediaJustice, MPower Change, Open Markets Institute, RootsAction.org, Partnership for Working Families, Secure Justice, STOP The Surveillance Technology Oversight Project, Woodhull Freedom Foundation and X-Lab.

The letter was sent just over a month ahead of the presidential election and as election interference concerns increase.

A senior official at the Office of the Director of National Intelligence warned in an assessment last month that Russia, China and Iran were actively interfering in U.S. elections this year.

Microsoft warned this month that it was seeing Russian, Chinese and Iranian hackers target political groups, including the campaigns of President Trump and former Vice President Joe Biden.

The groups pointed to the Microsoft assessment in noting that adversaries were actively looking for loopholes, like cloud security compromises, to interfere in the presidential election.

"Given the stakes, Amazon should be doing everything they can to secure our elections," the groups wrote. "Responding to these disturbing compromises by placing blame and putting the onus on the user, like you have in the past, is unacceptable."

Evan Greer, the deputy director of Fight for the Future, criticized Amazon for not doing enough to shore up the security of its systems.

"Amazon is the most profitable corporation on the planet and Jeff Bezos is the richest person in human history," Greer said in a statement. "If a company this size wants to sell its software to governments for election purposes, it has a responsibility to ensure that those systems are properly configured and to be transparent about the steps they're taking to secure our elections. We can't let Amazon's greed corrode what's left of our democracy."

Read more here:
Privacy, civil rights groups demand transparency from Amazon on election data breaches | TheHill - The Hill

What is the Importance of ROI in Enterprise Application Integration? – CIOReview

In this fast-paced business environment, modern enterprises adopt newer and smarter applications to improve their business workflows.

Fremont, CA: Businesses depend on various applications, such as accounting software, analytics platforms, CRM systems, and HRMS systems, to manage their essential day-to-day operations. Integrating these enterprise applications brings transparency and order to an organization's business processes by unveiling the existing systems' hidden potential. While modern businesses are keen to adopt this trend to enhance their workflows, it is vital to consider the ROI impact of such integration.

It is critical for the CIO to focus on the merits of integration and then make a case for how an EAI tool saves software development time and maintenance costs. The CIO has to show how the integration supports the business strategy by enhancing customer service, optimizing inventory, and improving the product's go-to-market time through standardized business processes.

Typically, there are two kinds of integration strategies: for cloud applications and for on-premise applications. Enterprise applications can be hosted on-premise or on private/public cloud servers, or managed by third-party service providers. With a growing number of applications being deployed to the cloud within an organization, a hybrid integration framework is emerging to optimize for both kinds of applications. This reduces initial and ongoing IT costs and maximizes their value over time.

Traditionally, there are three kinds of application integration models.

Traditional on-premise integration platforms- These platforms can be used to integrate cloud applications, but deployments tend to be costly, take longer, and offer limited flexibility to support changes over time without specialized expertise or consultation.

Tactical point-to-point connectors- Subscribing to prebuilt cloud connectors helps connect discrete cloud application data fields to other cloud and on-premise applications. This approach offers low initial and stable ongoing costs with rapid deployment, though such connectors are generally not architected for enterprises because they fail to meet scalability and reliability requirements.

Ad-hoc, project-based integration- Leveraging existing employee skill sets and developer tools, or external consultation, companies can connect cloud and on-premise applications. This approach doesn't scale over time and often requires specialized knowledge for upgrades or changes.
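To make the point-to-point connector model above concrete: such a connector is essentially a field-level mapping between two applications' record formats, maintained for one specific pair of systems. The sketch below is purely illustrative; the function and field names are invented, not taken from any real product.

```python
# Hypothetical point-to-point connector: translate a record from a CRM
# application's format into an accounting application's format. Each such
# mapping serves exactly one pair of systems, which is why the approach
# struggles to scale as the number of applications grows.

def crm_to_accounting(crm_record: dict) -> dict:
    """Map a CRM contact record to an accounting-system customer record."""
    return {
        "customer_name": crm_record["full_name"],
        "billing_email": crm_record["email"],
        # Keep a back-reference so records can be traced to their source
        "account_ref": f"CRM-{crm_record['id']}",
    }

crm = {"id": 42, "full_name": "Acme Corp", "email": "ap@acme.example"}
print(crm_to_accounting(crm))
```

Connecting N applications pairwise this way requires on the order of N² such mappings, which is the scalability concern the article raises; hub-style EAI platforms avoid this by mapping each application to a shared canonical model instead.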

See also: Top Cloud Consulting/Services Companies

Read this article:
What is the Importance of ROI in Enterprise Application Integration? - CIOReview