Category Archives: Cloud Computing
MissionOG Exits Cloud Computing Analytics Provider Cloudamize – Markets Insider
PHILADELPHIA, Aug. 3, 2017 /PRNewswire/ -- MissionOG announced today the successful exit of Cloudamize, a leader in cloud computing analytics, following the completion of an acquisition by Cloudreach, a leading global cloud enabler for enterprises. Together the companies provide best-in-class software and services for the migration and management of cloud infrastructure.
Cloudamize is a leading technology platform that automates in-depth cloud infrastructure data collection and analytics. The solution collects billions of data points, analyzes them, and presents the findings in a simple way so that customers can make data-driven decisions.
The growth and successful acquisition of Cloudamize validates a MissionOG investment thesis that enterprises will accelerate the migration of applications and storage from data centers to public cloud, and will also require detailed analytics to efficiently manage the cloud lifecycle.
"The founders, Khushboo and Stephan, and the Cloudamize team have consistently innovated their platform based on market needs and delivered high value to their enterprise clients," noted George Krautzel, managing partner of MissionOG. "By providing an easy means to understand complex infrastructure data, they have simplified the migration process and helped enterprises start to realize the potential of the cloud."
"We greatly valued the investment and strategic guidance from George and the MissionOG team," said Bob Moul, CEO of Cloudamize. "MissionOG's operational support and strong network made them an ideal investment partner." MissionOG led several investment rounds for the company. The proceeds from the exit represented a 52% IRR.Krautzel served on the board of directors along with MissionOG advisor Jonathan Palmer.
According to Gartner, spending on Infrastructure-as-a-Service will expand from $24 billion in 2016 to $75 billion in 2020. Cloudreach and Cloudamize are respective leaders in accelerating this growth in adoption.
MissionOG has invested in 15 companies since its founding in 2012. The firm invests in technology and software-related B2B businesses with a specific focus on financial services, payments, SaaS, and data platforms.
About Cloudamize
Cloudamize is a cloud computing analytics platform that provides data analysis and recommendations to speed and simplify cloud migration and cost management. Our platform helps you choose your best-fit cloud vendor; automates application discovery and dependency mapping to design a precise migration plan; analyzes your performance metrics and usage patterns on an ongoing basis to ensure your cloud is always right-sized; and provides clear visibility into cloud costs for better control. Armed with these insights, you can more quickly make accurate cloud decisions, achieve cost-performance optimization, and maximize the value of your cloud investments. Cloudamize is proudly located in Philadelphia. For more information, please visit https://www.cloudamize.com.
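The right-sizing idea described above can be illustrated with a small, purely hypothetical Python sketch. This is not Cloudamize code or its API; the instance catalogue, metric names, and headroom factor are invented for illustration. It simply compares observed peak utilization against provisioned capacity and picks the smallest candidate size that still leaves some headroom.

    # Hypothetical right-sizing illustration only; not Cloudamize code or its API.
    from dataclasses import dataclass

    @dataclass
    class InstanceMetrics:
        name: str
        vcpus: int            # provisioned vCPUs
        peak_cpu_pct: float   # peak CPU utilization observed over the window

    # Invented catalogue of candidate sizes (vCPU counts), smallest first
    CANDIDATE_SIZES = [2, 4, 8, 16, 32]

    def recommend_size(m: InstanceMetrics, headroom: float = 0.3) -> int:
        """Pick the smallest size whose capacity covers peak load plus a safety headroom."""
        required = m.vcpus * (m.peak_cpu_pct / 100.0) * (1.0 + headroom)
        for size in CANDIDATE_SIZES:
            if size >= required:
                return size
        return CANDIDATE_SIZES[-1]

    vm = InstanceMetrics(name="app-server-01", vcpus=16, peak_cpu_pct=35.0)
    print(f"{vm.name}: provisioned {vm.vcpus} vCPUs, suggested {recommend_size(vm)} vCPUs")

A real platform of this kind would weigh many more signals (memory, disk, network, usage patterns over time), but the shape of the decision is the same: measure, compare against capacity, recommend.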
About MissionOG
MissionOG provides capital to early and growth stage B2B technology companies. We apply our experience and capabilities to a group of highly skilled and passionate entrepreneurs whose businesses are on the cusp of exponential growth. Headquartered in Philadelphia, MissionOG is managed by entrepreneurial operators who have effectively built early to growth stage businesses and guided them through successful acquisitions. For more information visit http://www.missionog.com.
Media Contact
Sophie Kaye
Email: sophie@missionog.com
Phone: 267-239-5869 x104
View original content with multimedia: http://www.prnewswire.com/news-releases/missionog-exits-cloud-computing-analytics-provider-cloudamize-300499282.html
SOURCE MissionOG
Cloud Computing and Collaborative MBSE Comes to HyperWorks – ENGINEERING.com
Custom cloud computing appliances can be accessed from most devices. (Image courtesy of PBS Works.)
Users can mix and match resources to configure their HPC environments.
Engineers can configure and manage cloud resources using PBScloud.io. (Image courtesy of PBS Works.)
You can try PBScloud.io for free, or learn more about the solution on the PBS Works website.
Engineers can connect engineering analysis with system models by adding the ModelCenter tool from the HyperWorks simulation platform. Users can create and automate a multitool workflow within the ModelCenter framework to integrate analyses performed on different platforms with tools from multiple vendors. Users can select the optimal analysis or simulation tool from preferred vendors and interconnect the input and output data to form a single unified workflow.
Engineers can customize workflows with ModelCenter. (Image courtesy of Altair.)
Engineers can connect the engineering analysis to the systems model using the ModelCenter MBSEPak. With this model-based systems engineering (MBSE) approach, users can combine preprocessing, solving, post-processing, visualization and reporting tools from various sources to enable multidisciplinary analysis, validation and simulation.
Users can integrate their preferred tools from nearly any software application into the ModelCenter workflow. For example, user-generated tools, legacy FORTRAN programs, C++ applications, spreadsheets, mathematical models, databases, models from CAD tools, CAE models, and others can be combined into the workflow. Users can then optimize the workflow to use HPC resources, iterate part or all of the workflow with different data, and manage the results. Engineers can very quickly explore and optimize performance, cost, reliability and risk for a number of different design alternatives using this approach.
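To make the workflow-chaining idea concrete, here is a conceptual Python sketch. It is not the ModelCenter API; the tool names and numbers are stand-ins. Each tool is wrapped as a callable whose outputs feed the next step, and the chain is iterated over design alternatives, mirroring how inputs and outputs are wired into a single unified workflow.

    # Conceptual sketch of a multitool workflow; NOT the ModelCenter API.
    def cad_geometry(length_m: float) -> dict:
        # Stand-in for a CAD tool exporting geometry parameters
        return {"length_m": length_m, "area_m2": 0.4 * length_m}

    def structural_solver(geometry: dict, load_n: float) -> dict:
        # Stand-in for a CAE solver that consumes the CAD output
        return {"max_stress_pa": load_n / geometry["area_m2"]}

    def report(design_length: float, results: dict) -> None:
        # Stand-in for a post-processing/reporting step
        print(f"length={design_length} m -> max stress {results['max_stress_pa']:.1f} Pa")

    # Wire the tools into one workflow and iterate it over design alternatives
    for length in (1.0, 1.5, 2.0):
        geom = cad_geometry(length)
        results = structural_solver(geom, load_n=5000.0)
        report(length, results)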
Cloud Computing in Industrial IoT: Market for Cloud support of IIoT by Software, Platforms, Infrastructure (SaaS … – Business Wire (press release)
DUBLIN--(BUSINESS WIRE)--The "Cloud Computing in Industrial IoT: Market for Cloud support of IIoT by Software, Platforms, Infrastructure (SaaS, PaaS, and IaaS) and Outlook for Centralized Cloud and Fog Computing for IIoT Devices and Objects/Things 2017 - 2022" report has been added to Research and Markets' offering.
Cloud Computing is moving beyond the consumer and enterprise markets into support for manufacturing and industrial automation across other industry verticals. The Industrial Internet of Things (IIoT) represents a substantial opportunity both for the centralized cloud as-a-service model for software, platforms, and infrastructure, and for distributed computing, often referred to as Fog Computing, wherein IIoT edge computing gives industry real-time processing and analytics.
This research evaluates the technologies, players, and solutions relied upon for Cloud Computing in IIoT. The report analyzes the impact of SaaS, PaaS, and IaaS upon IIoT as well as Cloud Computing software, platforms, and infrastructure in support of edge computing. The report also assesses market opportunities for Cloud Computing support of IIoT Devices and the Objects/Things that will be monitored, actuated, and controlled through IoT enabled processes. The report includes detailed forecasts for the global and regional outlook as well as by industry vertical, devices, and objects/things from 2017 to 2022.
Target Audience:
Companies Mentioned
Key Topics Covered:
1 Overview
2 IIoT Cloud Computing Ecosystem
3 Industrial IoT Cloud Computing Market
4 IIoT Cloud Connected Devices/Things Forecasts
5 Company Analysis
For more information about this report visit https://www.researchandmarkets.com/research/93wpcp/cloud_computing
Cloud Computing Market in Latin America 2016-2020 – Key vendors are Amazon Web Services (AWS), Microsoft, IBM … – Business Wire (press release)
DUBLIN--(BUSINESS WIRE)--The "Cloud Computing Market in Latin America 2016-2020" report has been added to Research and Markets' offering.
The analysts forecast the cloud computing market in Latin America to grow at a CAGR of 10.16% during the period 2016-2020.
Cloud Computing Market in Latin America 2016-2020 has been prepared based on an in-depth market analysis with inputs from industry experts. The report covers the market landscape and its growth prospects over the coming years. The report also includes a discussion of the key vendors operating in this market.
One of the latest trends in the market is the growing momentum of cloud brokerage services. Cloud service brokers act as intermediaries between cloud service providers and businesses that integrate their infrastructures with cloud computing platforms.
These brokers provide management and maintenance services to enterprises. They understand enterprise requirements and provide enterprises with a set of vendors that meet their current requirements and, to some extent, future needs as well.
Cloud brokers also provide consulting services and assistance in selecting the right vendor. After procuring solutions, cloud brokers assist clients in integrating applications with their existing computing infrastructures.
Along with installation and maintenance, cloud service brokers also deliver deduplication, security, and data protection services to enterprises.
Key vendors
Other prominent vendors
Key Topics Covered:
Part 01: Executive summary
Part 02: Scope of the report
Part 03: Market research methodology
Part 04: Introduction
Part 05: Market landscape
Part 06: Market segmentation by services
Part 07: Market segmentation by IT deployment model
Part 08: Geographical segmentation
Part 09: Key leading countries
Part 10: Market drivers
Part 11: Impact of drivers
Part 12: Market challenges
Part 13: Impact of drivers and challenges
Part 14: Market trends
Part 15: Vendor landscape
Part 16: Key vendor profile
For more information about this report visit https://www.researchandmarkets.com/research/nb8smj/cloud_computing
Amazon Miss Sparks Debate: Will Video Pay Off Like Cloud … – Investor’s Business Daily
Amazon.com's (AMZN) surprise second-quarter earnings miss has analysts debating the margin impact of its far-flung investments, and while cloud computing seems a good bet to pay off, internet video may be raising some eyebrows.
The difference: Amazon Web Services is by far the biggest provider of cloud computing services, leading Microsoft (MSFT), Google-parent Alphabet (GOOGL) and others.
In web video, Amazon aims to catch up with Netflix (NFLX), Google's YouTube, Time Warner's (TWX) HBO and others. Amazon has stepped up investments in original content and has expanded video streaming services overseas.
Shares in the e-commerce giant were down 2.5% to 1,020.04 on the stock market today after Amazon's revenue topped views, but profit fell far short of estimates. That's off intraday lows of 1,001 and above Amazon's recent buy point of 1,016.60.
Michael Olson, analyst at Piper Jaffray, maintains an overweight rating on Amazon stock.
"An irony in the Amazon margin story is that we don't believe investors would disagree with any of the initiatives to which Amazon is committing significant capex or opex dollars," he said in a note to clients.
At Stifel, analyst Scott Devitt isn't sure about that.
"Amazon is spending aggressively on everything under the sun," Devitt said in a report. "It may all work but it is clear that investment levels will be heightened in the near term. The most recent investment cycle created a buying opportunity once certain investments received reduced allocations based on limited success, China and mobile phone. We see a lot of positive in this cycle but note two areas that could eventually be viewed as problematic by investors: video and various new retail initiatives such as grocery."
The e-commerce giant views internet video as a recruitment tool for its customer-loyalty program, Amazon Prime. The subscription service, which costs $99 annually, has 80 million U.S. members. Prime encourages shoppers to buy more goods, with free delivery and other perks.
IBD'S TAKE: Amazon, an IBD Leaderboard stock, has retreated from a buy range after reporting earnings. It had cleared a late-stage flat base with a 1,016.60 buy point. Its IBD Composite Rating has dipped to 97 out of a possible 99.
"Amazon strongly believes that its video content offerings efficiently enable conversion of customers into paid Prime subscribers and lead to greater spend per subscriber," said Mark Mahaney, a RBC Capital analyst in a report.
Aside from cloud computing and online video, Amazon's other big investments include retail grocery, fulfillment-center buildouts, engineering staff and salespeople, artificial intelligence and overseas markets like India. "The fulfillment and AWS waves may dissipate, but the video investment seems early," said Michael Graham, a Canaccord Genuity analyst, in a report.
RMU offers new course in Amazon cloud computing with eyes on … – Pittsburgh Post-Gazette
Robert Morris University will begin offering Amazon Web Services training, which could prepare its IT students for a career in cloud computing.
The dangers of cloud computing: Is your information safe? – NY … – New York Daily News
The Cloud. You may have heard of it, and no, I don't mean those fluffy things in the sky.
But what is it? How does it work? And what does it mean for you and your family?
You may be surprised to realize that you and your family likely use the cloud almost every day. Have you uploaded a photo to Instagram? Well, it's now in the cloud. How about backing up your iPhone to the iCloud? That's a little obvious, but yes, you've backed up your photos, contacts, and notes to a cloud.
The cloud is the latest way of storing information and using services on the internet, but its origins go back to the 1950s. At that time computers were expensive, and users needed to find a way to share the limited and valuable processing time. This idea of sharing is the foundation of cloud computing.
Rather than have each individual user have her own terminal with its own software, the software or information is centrally located. She logs into her account, accessing the information in the office, on the road, or at her local café.
It's important to realize the cloud is not a physical thing per se. You can't wrap your arms around it. Come to think of it, you can't do that to a real cloud either!
The cloud exists in massive server farms. You probably have driven past them and not realized it. They are massive warehouses filled with networks of servers running software, applications, and storing information. They are pretty cool to look at, with miles and miles of bundled cables, flashing LEDs, and whirring fans to keep the temperature down.
Amazon Web Services, or AWS, is one of the largest providers of cloud services. Social media sites like Pinterest and Instagram host their platforms on AWS servers. Even the CIA and NSA use AWS to host some of their I.T. infrastructure.
So why do companies and organizations go to the cloud? Two primary reasons.
First, it saves a ton of money. Rather than buying computers and software licenses for every individual employee, you can pay a subscription fee. One example is Adobe's creative tools like Photoshop. When that employee leaves, you just cancel her subscription. The same is true if you expand. Add another employee, add another subscription. It's just that easy.
It also saves companies by centralizing services such as cybersecurity, something we've talked a lot about here at World War E. Before, you needed to have an on-site I.T. expert who was the one-stop shop for all things computer. She would respond if your computer froze or your email turned off, but she would also patrol the digital walls to make sure hackers didn't breach the perimeter.
The problem here is that she needed to do that for all the computers, all the devices, and the entire I.T. infrastructure. Now scale that up to a multi-national company with offices on six continents. It gets complicated, and fast.
With the cloud, services such as cybersecurity and I.T. management are pushed up the chain to a central location where they are easier and cheaper to manage.
Second, with the cloud you can access your information from anywhere in the world. Want to work from home on your couch? You can do that with the cloud. Need to show the family photos to grandma in Michigan? Send her a link to Flickr. You don't have to send the information, photos, or videos over the internet anymore.
How secure is the cloud? Well, as with any technology, there are issues. Centralizing your personal information and phone back-ups makes some people wary. If these cloud services are breached, you could lose a lot more than just some photos and funny pet videos. Thankfully, companies like Amazon and Google are fully aware of this and are placing a high premium on security and encryption.
The cloud is an exciting technology and one that is changing the way we live and work. As with anything, we need to adopt basic cyber hygiene practices when using the cloud. Change your passwords frequently and use strong ones. And no, "password" is not a good password. Keep backups on your computer or an external hard drive in addition to the cloud. Nothing is foolproof. But you can help yourself by being smart online.
The central role of the server in open networking – Cloud Tech
Open networking is a hot topic these days. When we read about open networking products and initiatives, the emphasis is on network switches more often than not. But server-based networking has also proceeded along an increasingly open path, and in many ways it set the stage for the opening of switch technology.
Network switches like top of rack (TOR) switches have traditionally been closed: they come from specific vendors with proprietary software. Networking in commercial off the shelf (COTS) servers has been open for several years, thanks to the proliferation of Linux server operating systems (OSs) and networking technologies like Open vSwitch (OVS). The networking industry wants the switch world to follow the servers' successful path; hence the birth and popularity of the term open networking.
Switches have traditionally been closed: the network operating systems and protocols that run on the switches have been proprietary, could not be disaggregated from the hardware and were not open source. At first, switches were really closed because the switch ASIC, the software and the switch box were all from a single vendor and were proprietary. Then, switches got disaggregated a bit when the switch vendors adopted switch ASICs from merchant silicon vendors like Broadcom. Next came OpenFlow and OpenFlow-based SDN controllers like Floodlight, which proposed that the switch control plane protocols be removed from the switch and placed in an open source controller. This in some ways disaggregated the OS from the switch box.
Subsequently, switch operating systems like Cumulus Linux came into the market. These are disaggregated because they can install and run on merchant switch ASIC-based switch boxes from multiple vendors like Quanta and Dell. But such disaggregated switch OSes are not necessarily open source.
More recently, open source switch operating systems like SONiC and Open Network Linux have been in the news. The open source controller ecosystem has further evolved as well, focusing on feature completeness and carrier grade reliability (i.e. OpenDaylight and ONOS).
All in all, significant action and news in the realm of open networking have been related to switches, geared toward helping the industry manage the switch supply chain more effectively and deploy efficiently, similar to the COTS server model.
Figure 1: Switch disaggregation follows server model
What seems to get overlooked in these discussions about open networking is the all-important precursor to this movement: open networking on servers. Most importantly, how open networking on servers (or server-based open networking) has evolved and enabled open networking on switches.
Over the last several years, TOR switches have become simpler because data centre traffic patterns have changed and networking infrastructure efficiency requirements have increased. When using leaf (TOR) and spine switches, the imperative has shifted to moving east-west traffic most efficiently, which requires more bandwidth, more ports and lower latency. As a result, the feature requirements in hardware and software in leaf and spine switches have been reduced to a simpler set. This has made open networking in switches easier to implement and deploy.
However, the smarts of networking did not disappear; they just moved to the server, where such smarts are implemented using the virtual switch (preferably an open one such as OVS) and other Linux networking features like iptables. Many new networking features related to network security and load balancing have been added to OVS.
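As a rough illustration of that shift, the following minimal Python sketch pushes two OpenFlow rules into a server's virtual switch with ovs-ofctl. It assumes Open vSwitch is installed and that a bridge named br0 already exists; the bridge name and rules are examples only, showing the kind of per-flow security policy that now lives on the server rather than in the TOR switch.

    # Minimal sketch; assumes Open vSwitch is installed and bridge "br0" exists.
    import subprocess

    def add_flow(bridge: str, flow: str) -> None:
        # Push an OpenFlow rule into the local virtual switch via ovs-ofctl
        subprocess.run(["ovs-ofctl", "add-flow", bridge, flow], check=True)

    if __name__ == "__main__":
        # Example security-style policy applied at the server's vswitch:
        # drop TCP traffic to port 23 (telnet), let everything else through
        add_flow("br0", "priority=200,tcp,tp_dst=23,actions=drop")
        add_flow("br0", "priority=0,actions=NORMAL")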
OpenStack, as an open source and centralized cloud orchestration tool, has rapidly come to prominence, with more than 60% of OpenStack networking deployed today using OVS (with OpenStack Neutron). Server-based open networking has evolved relatively quietly compared to open networking in switches, but it has made major contributions toward bringing deployment efficiencies and flexibility.
Today, in many high growth cloud, SDN and NFV applications, server-based open networking is running into server sprawl and related TCO challenges. As the networking bandwidths increase and the number of VMs proliferates on servers, OVS processing is taking up an increasingly large number of CPU cycles, which is limiting the number of CPU cycles available for processing applications and VMs.
Data centre operators cannot economically scale their server-based networking using traditional software-based virtual switches. So implementing server-based networking in x86 architectures and software is a double whammy: it increases costs as too many CPU cores are consumed, and it lowers performance as applications are starved for resources.
Offloading network processing to networking hardware is an option that has worked well in the past. However, software-defined and open source networking is evolving at a rapid pace; such innovation stops the moment data centre operators look to inflexible networking hardware for performance and scale.
Figure 2: Networking smarts moving to servers
The solution to this challenge is to offload OVS processing to a SmartNIC. A SmartNIC handles I/O functions and incorporates a programmable network processor that can run OVS and other software. With a SmartNIC handling OVS processing, performance is boosted by up to 5X, and the data centre operator frees as many as 11 CPU cores from network-related processing, enabling greater VM scalability and lower costs. Because it is programmable, a SmartNIC can evolve rapidly as new features emerge, preserving the pace of innovation.
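For readers curious what the offload step looks like in practice, here is a minimal sketch, assuming Open vSwitch plus a NIC and driver with TC flower offload support. The systemd service name varies by distribution, so treat the commands as illustrative rather than definitive.

    # Minimal sketch; assumes OVS and a NIC/driver with TC flower offload support.
    import subprocess

    def run(cmd: list) -> None:
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        # Ask OVS to offload datapath flows to capable hardware (the SmartNIC)
        run(["ovs-vsctl", "set", "Open_vSwitch", ".", "other_config:hw-offload=true"])
        # The setting takes effect after the OVS daemons restart; the service
        # name differs by distribution (e.g. openvswitch-switch on Debian/Ubuntu)
        run(["systemctl", "restart", "openvswitch-switch"])

Once hardware offload is enabled, OVS attempts to push matching datapath flows down to the NIC, and only flows the hardware cannot handle fall back to processing on the host CPU.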
Although server-based networking by itself can cause server sprawl, SmartNICs are making the case for efficient and flexible open networking from the COTS server side.
Figure 3: A SmartNIC offloads networking from servers
Federal Cloud Computing – TechTarget
The following is an excerpt from Federal Cloud Computing by author Matthew Metheny and published by Syngress. This section from chapter three explores open source software in the federal government.
Open source software (OSS) and cloud computing are distinctly different concepts that have independently grown in use, both in the public and private sectors, but have each faced adoption challenges by federal agencies. Both OSS and cloud computing individually offer potential benefits for federal agencies to improve their efficiency, agility, and innovation, by enabling them to be more responsive to new or changing requirements in their missions and business operations. OSS improves the way the federal government develops and also distributes software and provides an opportunity to reduce costs through the reuse of existing source code, whereas cloud computing improves the utilization of resources and enables a faster service delivery.
In this chapter, issues faced by OSS in the federal government will be discussed, in addition to the relationship to the federal government's adoption of cloud computing technologies. However, this chapter does not present a differentiation of OSS from proprietary software; rather, it focuses on highlighting the importance of the federal government's experience with OSS in the adoption of cloud computing.
Over the years, the private sector has encouraged the federal government to consider OSS by making the case that it offers an acceptable alternative to proprietary commercial off-the-shelf (COTS) software. Regardless of the potential cost-saving benefits of OSS, federal agencies have historically approached it with cautious interest. There are, however, other potential issues beyond cost in transitioning from existing proprietary software. These issues include a limited in-house skillset for OSS developers within the federal workforce, a lack of knowledge regarding procurement or licensing, and the misinterpretation of acquisition and security policies and guidance. Although some of these challenges and concerns have limited or slowed a broader-scale adoption of OSS, federal agencies have become more familiar with OSS and with the expanding marketplace of available products and services, and have come to consider OSS a viable alternative to enterprise-wide COTS software. This renewed shift toward OSS is also being driven by initiatives such as 18F and the US Digital Service, and by the publication of guidance such as the Digital Services Playbook, which urges federal agencies to "consider using open source, cloud based, and commodity solutions across the technology stack".
Interoperability, portability, and security standards have already been identified as critical barriers for cloud adoption within the federal government. OSS facilitates overcoming standards obstacles through the development and implementation of open standards. OSS communities support standards development through the "shared" development and industry implementation of open standards. In some instances, the federal government's experience with standards development has enabled the acceptance and use of open standards-based, open source technologies and platforms.
The federal government's use of OSS has its beginning in the 1990s. During this period, OSS was used primarily within the research and scientific community where collaboration and information sharing was a cultural norm. However, it was not until 2000 that federal agencies began to seriously consider the use of OSS as a model for accelerating innovation within the federal government. As illustrated in Fig. 3.1, the federal government has developed a list of OSS-related studies, policies, and guidelines that have formed the basis for the policy framework that has guided the adoption of OSS. This framework tackles critical issues that have inhibited the federal government from attaining the full benefits offered by OSS. Although gaps still exist in specific guidelines relating to the evaluation, contribution, and sharing of OSS, the policy framework serves as a foundation for guiding federal agencies in the use of OSS. In this section, we will explore the policy framework with the objective of describing how the current policy framework has led to the broader use of OSS across the federal government, and more importantly how this framework has enabled the federal government's adoption of cloud computing by overcoming the challenges with acquisition and security that will be discussed in detail in the next section.
The President's Information Technology Advisory Committee (PITAC), which examined OSS, was given the goal of:
The PITAC published a report concluding that the use of the open source development model (also known as the Bazaar model) was a viable strategy for producing high-quality software through a mixture of public, private, and academic partnerships. In addition, as presented in Table 3.1, the report also highlighted several advantages and challenges. Some of these key issues have been at the forefront of the federal government's adoption of OSS.
Over the years since the PITAC report, the federal government has gained significant experience in both sponsoring and contributing to OSS projects. For example, one of the most widely recognized contributions by the federal government, specifically related to security, is the Security Enhanced Linux (SELinux) project. The SELinux project focused on improving the Linux kernel through the development of a reference implementation of the Flask security architecture for flexible mandatory access control (MAC). In 2000, the National Security Agency (NSA) made SELinux available to the Linux community under the terms of the GNU's Not Unix (GNU) General Public License (GPL).
Starting in 2001, the MITRE Corporation, for the US Department of Defense (DoD), published a report that built a business case for the DoD's use of OSS. The business case discussed both the benefits and risks of considering OSS. In MITRE's conclusion, OSS offered significant benefits to the federal government, such as improved interoperability, increased support for open standards and quality, lower costs, and agility through reduced development time. In addition, MITRE highlighted issues and risks, recommending that any consideration of OSS be carefully reviewed.
Shortly after the MITRE report, the federal government began to establish specific policies and guidance to help clarify issues around OSS. The DoD Chief Information Officer (CIO) published the Department's first official DoD-wide memorandum to reiterate existing policy and to provide clarifying guidance on the acquisition, development, and the use of OSS within the DoD community. Soon after the DoD policy, the Office of Management and Budget (OMB) established a memorandum to provide government-wide policy regarding acquisition and licensing issues.
Since 2003, there had been multiple misconceptions, specifically within the DoD, regarding the use of OSS. Therefore, in 2007, the US Department of the Navy (DON) CIO released a memorandum that clarified the classification of OSS and directed the Department to identify areas where OSS could be used within the DON's IT portfolio. This was followed by another DoD-wide memorandum in 2009, which provided DoD-wide guidance and clarified the use and development of OSS, including explaining the potential advantages for the DoD in reducing development time for new software, anticipating threats, and responding to continual changes in requirements.
In 2009, OMB released the Open Government Directive, which required federal agencies to develop and publish an Open Government Plan on their websites. The Open Government Plan provided a description of how federal agencies would improve transparency and integrate public participation and collaboration. As an example of responding to the directive's support for openness, the National Aeronautics and Space Administration (NASA), in furtherance of its Open Government Plan, released the "open.NASA" site, which was built entirely using OSS such as the LAMP stack and the WordPress content management system (CMS).
On May 23, 2012, the White House released the Digital Government Strategy that complements other initiatives and established principles for transforming the federal government. More specifically, the strategy outlined the need for a "Shared Platform" approach. In this approach, the federal government would need to leverage "sharing" of resources such as the "use of open source technologies that enable more sharing of data and make content more accessible".
The Second Open Government Action Plan established an action to develop an OSS policy to improve access by federal agencies to custom software to "fuel innovation, lower costs, and benefit the public". In August 2016, the White House published the Federal Source Code Policy, which is consistent with the "Shared Platform" approach in the Digital Government Strategy, requiring that federal agencies make custom code available as OSS. Further, the policy also made "custom-developed code available for Government-wide reuse and make their code inventories discoverable at https://www.code.gov ('Code.gov')".
In this section, we discussed key milestones that have impacted the federal government's cultural acceptance of OSS. It also discussed the current policy framework that has been developed through a series of policies and guidelines to support federal agencies in the adoption of OSS and the establishment of processes and policies to encourage and support the development of OSS. The remainder of this chapter will examine the key issues that have impacted OSS adoption and briefly examine the role of OSS in the adoption of cloud computing within the federal government.
About the author:
Matthew Metheny, PMP, CISSP, CAP, CISA, CSSLP, CRISC, CCSK, is an information security executive and professional with twenty years of experience in the areas of finance management, information technology, information security, risk management, compliance programs, security operations and capabilities, secure software development, security assessment and auditing, security architectures, information security policies/processes, incident response and forensics, and application security and penetration testing. He currently is the Chief Information Security Officer and Director of Cyber Security Operations at the Court Services and Offender Supervision Agency (CSOSA), and is responsible for managing CSOSA's enterprise-wide information security and risk management program, and cyber security operations.
Four in five firms say they avoid running sensitive data in the public cloud – Cloud Tech
As the true definition of hybrid cloud continues to blur, eight out of 10 enterprises in a survey from Stratoscale say they avoid running sensitive data in the public cloud.
The results appear in the company's Hybrid Cloud Survey, which was conducted in June and features more than 600 responses from organisations of varying sizes.
According to the research, 77% of respondents define the value of the hybrid cloud in one of two ways, depending on how far advanced their adoption is. In enterprises with hybrid cloud adoption levels below 20%, hybrid is most frequently defined as the ability to move workloads between private and public cloud; once adoption exceeds 20%, the concept shifts to one of different workloads belonging in different public and private environments.
More than 80% of enterprises polled said they had either moderate or high levels of concern around public cloud lock-in, with the smallest companies (those with fewer than 100 employees) the most concerned. More than half of enterprises have also moved less than one fifth of their workloads to the public cloud, with smaller firms the furthest into their journey and larger companies more likely to have started but progressing at a much slower pace.
"The transformation to a digital business by implementing cloud services and platforms is no longer much of an option; it's an imperative for the continued survival of any enterprise," said Ariel Maislos, Stratoscale CEO, in a statement. "The findings from our survey confirm what we're hearing from our customers: although many have started their journey to the public cloud, the vast majority of companies are still running mission critical workloads and sensitive data in private solutions, primarily for security reasons."
"It's clear the hybrid cloud model represents the near and long-term future for most enterprises, regardless of size," Maislos added.
The lack of consensus around the definition of hybrid cloud is reminiscent of a study conducted by Clutch at the start of this year regarding DevOps, where not even a quarter of respondents could agree on a definitive meaning for the term.
Read more: What is the point of hybrid cloud - or, is it time to re-evaluate hybrid?