Category Archives: Cloud Servers
Dell, HPE, IBM And Lenovo Face Competition From Cloud-Based Supercomputing – Forbes
HPE recently completed its acquisition of Cray, after doing the same with SGI just three years ago. Due to consolidation, top merchant supercomputer vendors (as opposed to government entities building their own) now include server OEMs Dell, HPE and IBM based in North America, Fujitsu in Japan and Atos Bull in Europe. Inspur, Lenovo and Sugon lead a growing group of supercomputer-focused OEM server vendors in China.
Cloud Competition
The challenge for all these supercomputer vendors is that public cloud vendors are also targeting the high-performance computing (HPC) and supercomputing markets. Public cloud providers are changing the market's demand for functionality. This competition will challenge branded server OEMs' ability to push upmarket toward higher-end customers with traditional on-prem HPC server clusters.
Alibaba Cloud, AWS and Azure are already deploying HPC- and supercomputing-worthy infrastructure and services. I expect the public cloud giants' HPC- and supercomputing-focused deployments to improve continuously over time. As predictions go, that's fairly tame: it's what they do.
Therefore, I believe it will be increasingly hard for HPC infrastructure vendors to sell clusters directly to end customers as customers opt to simply configure a supercomputer out of available public cloud instance types and networking options.
Public vs. Private Infrastructure
All the same arguments heard for the past decade about private infrastructure vs. public cloud are now surfacing in HPC market positioning.
Set aside all of the tired tropes about security, availability, latency, etc. Public clouds provide infrastructure, service and support as good as or better than most IT departments can manage on their own, and they have been doing so for years.
For public cloud-based HPC and supercomputing services to be successful, they cannot:
Alibaba Cloud, AWS and Azure all have recently deployed new HPC and supercomputing instance types and sizes implementing fast Ethernet networking (Azure also offers high-end InfiniBand networking) and shared-memory clustering capabilities that enable customers to meet both of the above requirements.
GCP asks HPC customers to use Preemptible instance types, which requires refactoring of existing HPC applications. In addition, Preemptible instance types have many other restrictions, including the lack of any Service Level Agreements (SLAs). GCP's Cloud TPU Pods cannot be programmed with traditional supercomputing software development tools.
Data Gravity
Another important point of tension between using private and public infrastructure is data gravity.
Data gravity is somewhat similar to real gravity. The more massive a celestial object is (planet, star, galaxy or whatever), the more it influences objects in its vicinity. From a spaceflight perspective, getting spaceships out of Earth's gravity well is very expensive. It's another order of magnitude more expensive to send spacecraft out of the Sun's gravity well; humanity has only done that twice (Voyagers 1 and 2).
Data gravity urges IT customers to include data transfer and storage costs when considering the total costs of migrating applications from on-prem infrastructure to public cloud infrastructure. In practical use, the key data gravity consideration for most applications is simple: does an application send a lot of data back out of the cloud?
Sending a lot of data into a public cloud may take time, but most clouds do not charge data ingress fees. Moving data within a public cloud can run up data transfer charges. Sending a lot of data out of a cloud most certainly will run up data transfer expenses.
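As a back-of-the-envelope sketch of that asymmetry (the per-GB rates below are illustrative placeholders, not any provider's actual pricing):

```python
# Sketch of the data-gravity cost asymmetry for a cloud HPC workload.
# Rates are invented for illustration, not real provider pricing.

INGRESS_PER_GB = 0.00   # most clouds do not charge for data ingress
EGRESS_PER_GB = 0.09    # hypothetical egress rate, in $/GB

def transfer_cost(gb_in, gb_out):
    """Data-transfer cost of moving gb_in into and gb_out back out of a cloud."""
    return gb_in * INGRESS_PER_GB + gb_out * EGRESS_PER_GB

# A 50 TB simulation that returns only a 100 GB result set is cheap to egress;
# pulling the full 50 TB back out costs orders of magnitude more.
results_only = transfer_cost(50_000, 100)
full_dataset = transfer_cost(50_000, 50_000)
print(f"results only: ${results_only:,.2f}, full dataset: ${full_dataset:,.2f}")
```

In practice the decision hinges on the outbound term: applications whose outputs are small relative to their inputs fit the cloud's pricing model far better than those that must repatriate whole datasets.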
A different kind of data gravity is defined by government and military security. While public clouds are more than competitive with private infrastructure for commercial-grade security, there are datasets that must be air gapped (not connected to the public internet) and must not leave the facility or organization that created the dataset.
Customers using public cloud HPC resources most likely do not want to:
The combination of shared-memory architecture and data gravity points to several areas in which public cloud-based HPC and supercomputing clusters have distinct advantages in the short term:
Market Disruption
For the most part, disruptive technological change starts at the low end of markets (despite a handful of Apple and Tesla counterexamples). Public cloud HPC and supercomputing capabilities will further disrupt on-prem server and data center infrastructure sales. It's just a matter of how quickly this happens.
The author is an employee of Liftr Insights. The author and Liftr Insights may, from time to time, engage in business transactions involving the companies and/or the products mentioned in this post. The author has not made an investment in any company mentioned in this post. The views expressed in this post are solely those of the author and do not represent the views or opinions of any entity with which the author may be affiliated.
Save over 90% on 5TB of cloud storage with this Black Friday deal – Boing Boing
Many of us rely on a single hard drive to store precious files. This strategy is risky, but many alternative backup solutions are pretty expensive. Polar Cloud Backup breaks the mold, providing secure storage at a price that anyone can afford. The service runs on reliable Amazon architecture and gives users total control over their data.
Available to download on PC and Mac, Polar Backup allows users to choose which files and folders to upload. All file types are supported, including videos, images, audio, documents, and more. The apps also have a scheduling feature, which ensures that the backup won't disrupt your Netflix movie.
Polar Backup provides enough space for thousands of files, and the storage is both secure and private. The data is protected by military-grade AES-256 encryption, while the servers comply with GDPR regulations.
Along with files on your PC or Mac, Polar Backup covers any connected external drives. Even if you delete files locally, they won't be removed from cloud storage.
Ahead of Black Friday, the Polar Backup lifetime 5TB plan is just $69.99. You will struggle to find a cloud storage provider that offers more for this price.
Don't wait for Black Friday: you can get these top-sellers at deep discounts today!
Kubernetes Is the Future of Computing. Everything You Should Know. – Barron’s
Nearly all major technology companies are saying the same thing: Kubernetes is the next big thing in computing.
Greek for helmsman or pilot, Kubernetes is accelerating the transition away from legacy client-server technology by making cloud-native software development easier, better and faster.
Last week, more than 12,000 developers and executives gathered in San Diego at the largest annual Kubernetes conference, KubeCon. That's up from just 550 attendees four years ago. The conference-goers are all looking for ways to take advantage of Kubernetes and its ability to automatically deploy, manage, and scale software workloads in the cloud.
To understand the trend, let's start with the changing dynamics of software in the cloud. Cloud apps increasingly run in aptly named containers. The containers hold an application, its settings, and other related instructions. The trick is that these containers aren't tied down to one piece of hardware and can run nearly anywhere, across different servers and clouds. It's how Google manages to scale Gmail and Google Maps across a billion-plus users.
Alphabet's (ticker: GOOGL) Google long ago developed software called Borg to orchestrate its in-house containers, spinning them up and down as needed. In 2014, the search giant opted to make a version of Borg open source, calling it Kubernetes. Today, the major cloud providers all offer a Kubernetes option to customers.
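The orchestration job that Borg and Kubernetes perform can be illustrated in miniature. This toy scheduler (a deliberate simplification for intuition, not Kubernetes' actual algorithm) simply places each workload on the node with the most spare capacity:

```python
# Toy container scheduler: place each workload on the node with the most
# free capacity. The real Kubernetes scheduler weighs many more factors
# (affinity rules, taints, resource classes), so treat this as a sketch.

def schedule(workloads, nodes):
    """Map workload name -> node name, given required and free capacity units."""
    free = dict(nodes)
    placement = {}
    # Place the largest workloads first so they are less likely to be stranded.
    for name, need in sorted(workloads.items(), key=lambda kv: -kv[1]):
        node = max(free, key=free.get)          # node with the most free capacity
        if free[node] < need:
            raise RuntimeError(f"no node has capacity for {name!r}")
        free[node] -= need
        placement[name] = node
    return placement

placement = schedule({"web": 2, "db": 4}, {"node-a": 4, "node-b": 4})
print(placement)
```

The value of handing this loop to a platform is that it runs continuously: when a node dies or load grows, the orchestrator re-runs placement and spins containers up or down without human intervention.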
Aparna Sinha, the director of product for Kubernetes at Google, notes that Kubernetes is built by the same team that created Borg. "We are quite confident in its ability and how it enables applications to run more reliably, more efficiently, and more affordably," Sinha says. "Kubernetes has really taken off."
Gartner says more than 75% of global companies will run containerized applications by 2022, up from less than 30% today. Kubernetes has become the de facto standard for managing these containers.
"As enterprises modernize their infrastructure and adopt a hybrid multicloud strategy, we see Kubernetes and containers rapidly emerging as the standard," Jason McGee, chief technology officer of IBM Cloud Platform, told Barron's in an email.
In terms of who will thrive in the shift to Kubernetes, there are some early leaders. Last month, Microsoft (MSFT) Azure Chief Technology Officer Mark Russinovich told Barron's he thinks Microsoft's Kubernetes service is best-of-breed.
Some industry analysts are pointing to other companies. When asked for the Kubernetes vendors that came up the most during discussions with customers, Gartner analyst Arun Chandrasekaran listed Amazon Web Services (AMZN), Google Cloud, and IBM (IBM) Red Hat OpenShift. For on-premise companies looking to use multiple clouds, IDC analyst Gary Chen added, "Red Hat right now is the leader in Kubernetes software. They have the early lead."
It is still early in this new big trend. One thing is for sure: get ready to hear a lot more from technology companies about their Kubernetes strategies. The race is on.
Write to Tae Kim at tae.kim@barrons.com
The Risk Of Complexity And How To Fix It – Forbes
Starting on March 22, 2019, Capital One bank was the victim of one of the largest data breaches in history. A former employee of Amazon Web Services used her knowledge to bypass security and download credit applications of approximately 100 million people. She was eventually caught, but it's still unclear whether the information from those applications was sold or otherwise made available to hackers.
The breach happened because the hacker was able to take advantage of a misconfigured web application firewall on an AWS cloud application used by Capital One. That was just one firewall out of potentially hundreds. You'll note that there isn't a specific number of firewalls, because the number changes as cloud servers are spun up or taken down, or as the bank's network is changed.
Because firewalls (a firewall is a device that restricts network access to authorized people or devices) are frequently added to an enterprise when specific portions of the network are initially set up, they may not have any commonality. Each firewall will be one of only a few of its kind in the company.
This lack of standardization means that each one has to be configured individually in what can only be described as an intensely manual process. And even one mistake can make the firewall ineffective, especially to someone who knows the details of how it works.
The problem goes beyond firewalls. In many cases a company's servers need manual configuration, as do other security and network appliances. The number of such devices can reach the hundreds, and with cloud accounts it can reach into the thousands. You can see how managing all of this, even for a properly staffed IT department, can be overwhelming.
Getting a Handle
"The problem that most companies have, especially big companies, is that a lot of this stuff was put in piecemeal, but not with any strategy," said Jack Gold, principal analyst at J. Gold Associates. "They had an ad hoc process."
"We need to retire some of those tasks more quickly," said Tim Woods, VP of technology alliances for FireMon. He said that these overly complex, manual processes are making it hard for companies to get a handle on security.
"They're running away from security," Woods said. He said that Capital One tried to make its changes manually, which ultimately led to the breach. A better way, he said, is to automate firewall management.
Woods said that doing all of this manual work also wastes resources. "They have their best people doing mundane routine tasks," he said. By automating tasks such as firewall management, "we believe you may be able to reap as much as a 40 percent reduction in repetitive work cycles."
How to Automate
"A company needs to look for that low-hanging fruit," Woods said. "If I can automate those low-level tasks, it'll make a big difference."
Woods also said that preparing to automate your configuration process gives you a good opportunity to evaluate your processes and to validate your expectations of what you're trying to do.
Setting sensible requirements is an important first step in automating your firewall management. The goal should be to automate the activities that require the most time and are performed most frequently. Activities that occur only occasionally and require little time aren't good targets for automation.
To determine what the good targets are, study the workflow of your IT staff with regard to firewall, server and other types of configuration. Find out where they spend the most time and which tasks require tedious steps, and start with those, because they are the low-hanging fruit Woods mentioned.
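One crude way to surface that low-hanging fruit (the task list and figures below are invented for illustration) is to rank tasks by the total staff time they consume, that is, frequency times duration:

```python
# Rank configuration tasks as automation candidates by total staff time:
# frequent, long-running manual tasks float to the top. All data is invented.

tasks = [
    # (task name, runs per month, minutes per run)
    ("firewall rule change",    120, 45),
    ("server provisioning",      30, 90),
    ("quarterly audit report",    1, 240),
]

def monthly_minutes(task):
    """Total minutes per month a task consumes: frequency times duration."""
    _name, runs, minutes = task
    return runs * minutes

for name, runs, minutes in sorted(tasks, key=monthly_minutes, reverse=True):
    print(f"{name}: {runs * minutes} minutes/month")
```

Under these invented numbers the frequent firewall changes dwarf the rare audit report, which matches Woods' advice: automate what is done most often and takes the most time.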
Once you've determined your beginning tasks, it's time to look for an automation provider. FireMon is one; Tufin and other companies offer similar platforms.
As you're implementing the automation platform, you will also need to determine your larger goals. Are you primarily focused on overall security? Compliance? To some extent, your goals will dictate your process.
Or Maybe Outsource
Jack Gold isn't so sure that automating your firewall management is necessarily the best answer. Instead, he suggests looking at ways to ease the management effort for starters.
"What they should be thinking about is whether they can standardize on something," he said. "It's much easier to build out an orchestration platform of some sort if you get all of them pretty much the same."
"Another possibility is that companies are outsourcing," Gold said. "They do networks as a service (NaaS). They just pay a monthly fee. There's a high value in preventing security breaches and data loss, but there's not a high value in managing all of this. Managing those devices is overhead."
"The complexity has always been there and probably will always be there," Gold explained. "Networking as a service is getting some legs because companies don't want to deal with it. Also, IT doesn't have the resources to deal with it."
He said that it's important to ask whether there are better places to put those resources.
Gold pointed out that the nature of networking is changing with the nature of IT. "In the past companies felt they had to control their destinies themselves," he said. "That's changing with cloud-based services."
Gold explained that the growth of SDN (software-defined networking) means that you need to consider whether you should even be running your own network.
"Things are moving to SDN," Gold said. "New gear needs to be SDN-capable. NFV (network function virtualization) needs to be supported."
NFV, in which the basic functions of networking exist in software, is critical to the operation of virtualization and thus to cloud computing.
Outsourcing your networking and network operations lets you focus your efforts on the activities of your business, rather than on hiring even more staff just to run the network and its related security requirements.
Major network companies, including Cisco and IBM, offer NaaS, and that part of their business is growing fast. Outsourcing your network may well be the real solution to the overwhelming complexity of managing its configuration and operation.
Who is responsible for cloud security? It’s a bit foggy, finds McAfee – Verdict
UK businesses are steaming ahead to become cloud-only companies, but efforts to establish who in an organisation is responsible for cloud security are struggling to keep pace.
Research by cybersecurity firm McAfee found that 40% of large UK businesses expect to be cloud-only by 2021, with 70% expecting to be cloud-only at some point in the future.
Yet the survey of over 2,000 senior IT staff and employees in the UK, France and Germany found a lack of consensus as to who in the business is ultimately responsible for cloud security.
14% said the CEO should take responsibility, while 19% believe it should be the chief information officer. Just 5% said the chief information security officer is responsible for cloud security.
The role of IT manager drew the largest number of votes, with 34% believing them ultimately responsible for cloud security.
The findings echo those of a recent Big Data LDN survey, which found data responsibility to be spread thinly across the c-suite.
"What scares me about this is that the answers are, dare I say it, sort of all over the place," said Nigel Hawthorn, EMEA director of cloud security business at McAfee, speaking at a media roundtable.
"And I think this is why cloud security is not necessarily being addressed in a holistic manner, because it has to have an owner and has to have a team, led by someone, to actually make sure that it's being addressed."
Hawthorn said that the so-called shared responsibility models put forward by Microsoft and Amazon, the two largest cloud vendors, are not enough.
Drawing parallels with renting a car, he pointed out how manufacturers are responsible for safety features such as airbags, the rental firm for oil, and the driver for driving safely.
"There's no point in saying 'it's your fault, Ford' when I drove the car at 100 miles an hour into a wall," he said.
While 84% said the cloud improved their organisation's data security, cloud computing presents a unique set of security problems.
Data repositories containing sensitive business or customer information can be misconfigured by businesses, providing easy pickings for cybercriminals.
Previous research conducted by McAfee found that 99% of misconfigured cloud servers go undetected.
"You can outsource the work, but you can't outsource the risk," said Raj Samani, chief scientist and McAfee fellow. "And the reality is [that] in cloud computing, we see organisations and people migrating and outsourcing over to cloud services with the belief that it absolutely absolves them of any risk or any concerns."
So what's the solution? Hawthorn and Samani believe that educating users about cloud security at the right time and in the right context can help. But ultimately, an organisation needs to decide who is responsible for cloud security, give them adequate resources and allow their voice to be heard by the board.
"I think we're in a dangerous place if we're going to cloud as fast as possible but we haven't decided who's responsible for the security," added Hawthorn.
The 20 top tech skills that employers want and that can help you find a job, according to recruiting site Indeed – Business Insider
If you're trying to break into tech, learning certain new skills will help push you forward.
The job search site Indeed released a report this month about the top tech skills of 2019 based on job descriptions that are being posted.
Andrew Flowers, an economist at Indeed, says that in today's job market, there are two major trends that drive the top skills in tech. The first is the rise of data science, machine learning, and artificial intelligence. The second is the rise of cloud computing.
Some languages, like Java and C++, still remain some of the most important skills in tech, but they have been around for decades. On the other hand, the programming language Python is a relative newcomer and today is one of the top skills on the list.
"Python has had explosive growth," Flowers told Business Insider. "If I'm around the dinner table and a nephew asks what should I learn? Having done this report, I would say, learn Python."
The reason why Python has exploded is because of the rise of data science and machine learning, Flowers said.
"Some tools are good and will always be around," Flowers said. "Python is just so notable because it's easy to learn. It's used in data science and it's also used in web development. Its explosive growth is the #1 takeaway from this research."
Besides that, the report also shows growth in cloud technical skills for Amazon Web Services and Microsoft Azure.
"AWS is still the dominant cloud computing tool but Azure is growing really fast, too," Flowers said.
Here are the top 20 tech skills, according to Indeed:
Microsoft confirms Windows 10 2004 is the next OS version – WindowsReport.com
Microsoft has recently rolled out a 19033 preview build for the Windows 10 20H1 Update. That update is available for both Slow and Fast ring Windows Insiders. The most interesting thing about the 19033 preview build, however, is that it shows users the next version will be Windows 10 2004.
It was previously expected that the upcoming Windows 10 version would be version 2003, as previous spring releases ended with the 03 marker. However, the 19033 preview build shows that the 20H1 Update will be Windows 10 2004. Microsoft confirmed this in the update's blog post, stating: "We have chosen to use 2004 as the version to eliminate confusion with any past product names (such as Windows Server 2003)."
Microsoft also appears to be a little ahead of schedule with the 20H1 Update. It has already been leaked that the big M has finalized features for Windows 10 2004. Thus, Microsoft has now released the 19033 preview build for Slow ring users as well as those in the Fast ring. It has been speculated that the big M might move the 20H1 Update to the RTM stage during December.
The 19033 preview build includes nothing new in the way of features. Instead, it provides a series of fixes. Furthermore, it does not include any watermark in the desktop's bottom-right corner. The absence of a watermark further suggests that Microsoft is very nearly finished with the 20H1 Update.
All this means that Microsoft might release the 20H1 Update a little earlier than usual in the first half of 2020. The big M rolled out Windows 10's 2019 spring update in May, which was a little later than the more typical April release month. However, with the 20H1 Update seemingly ahead of schedule, Windows 10 2004 might feasibly become available as early as March or April 2020.
Windows 10 2004 incorporates some interesting new features. Automatic Bluetooth pairing will come in handy for users who utilize Bluetooth devices with their laptops and desktops. Windows 10 2004 will be able to automatically detect Bluetooth devices and initiate the pairing process without any user intervention required.
Windows 10 2004 will include a new Cloud download option for resetting the platform. The Reset this PC window includes the Cloud download option shown directly below. That enables users to download an image copy of the OS directly from Microsoft's cloud servers when reinstalling Windows.
The Windows Update tab in Settings will also include a new View optional updates option. Clicking View optional updates opens a list of optional updates for things like drivers. Then users can update Windows 10 by selecting some of the optional updates.
Chromium Edge is also nearing completion. Microsoft will probably release its revamped flagship browser before the 20H1 Update. Thus, Windows 10 2004 might also include Chromium Edge.
Users can now try out Windows 10 2004 in the Slow ring by joining the Windows Insider Program. It looks like that version will include at least a few more new features than Windows 10 1909, which Microsoft released in November 2019.
3 Key Takeaways From VMware’s Third Quarter – The Motley Fool
VMware (NYSE:VMW) reported third-quarter earnings after the market closed on Tuesday, and the virtual-machine software specialist delivered sales and earnings results that came in significantly ahead of the market's expectations -- at first glance. Revenue for the period rose roughly 12% year over year to $2.46 billion, ahead of the average analyst sales target of $2.41 billion. Non-GAAP (adjusted) earnings per diluted share came in at $1.49, down roughly 4.5% year over year but topping the average analyst estimate of $1.43.
Analysts at KeyBanc raised their price target on the stock from $175 to $182, and an analyst at Citi raised theirs from $183 to $198.
VMware stock posted gains shortly following the release and analyst price hikes. But these gains quickly reversed, and the stock closed down about 2.4% in Wednesday's trading following the release. The market appears to have realized that the analyst targets that VMware beat were only for the core business -- and had not factored in the contribution from the company's Carbon Black acquisition. Investors may have also reassessed the company's guidance.
Here's a deeper look at three points from the company's third-quarter results, outlook, and earnings call.
Image source: Getty Images.
VMware's Carbon Black acquisition closed on Oct. 8 and added $10 million in unexpected revenue to the company's third-quarter sales total. Backing out Carbon Black's sales contribution, VMware actually fell about $5 million short of the analysts' third-quarter sales target.
Carbon Black is a cloud-native endpoint protection specialist that VMware acquired in a $2.1 billion deal that was announced in August. VMware also initiated its acquisition of cloud-software developer Pivotal for $2.7 billion in August, and that deal is expected to close before the year is out. These moves are set to play a big role in accelerating the company's move away from on-premise solutions and into cloud-based services.
Here's CEO Patrick Gelsinger discussing the Carbon Black integration during the earnings call:
Carbon Black, together with the security-driven value-add from our networking with micro-segmentation, end-user computing, cloud and compute offerings, in aggregate represents approximately $1 billion of business for us this year. With this acquisition, we launched a new security business unit, including Carbon Black and our AppDefense offerings under the leadership of former Carbon Black CEO, Patrick Morley, as general manager. Since the close of the acquisition, we announced multiple new Carbon Black Cloud solutions and an enhanced partnership with Dell making Carbon Black Cloud the preferred endpoint security solution for Dell commercial customers. We believe the combination of Carbon Black and VMware will bring a fundamentally new paradigm to the security industry.
Even with a relatively small contribution from Carbon Black in the quarter, hybrid cloud and software-as-a-service (SaaS) revenue climbed 40% year over year to account for 13% of overall revenue. Total services revenue grew roughly 10% year over year to $1.48 billion.
Building out its cloud-delivery and cybersecurity offerings isn't the only big initiative taking place at VMware. The company is also working to build its position in Kubernetes. The open-source container system was originally designed by Google and appears to be on track to become the new standard for a range of server virtualization and container management applications.
Kubernetes is a platform for sharing and accessing private applications and workloads, but it differs from traditional virtual machines in that an underlying operating system can easily be shared between these apps. This makes shared software systems simpler to develop, launch, and update, in addition to a range of other versatility and performance advantages. These characteristics make it especially well suited for projects using hybrid-cloud architecture -- and an important product competency for VMware.
The company unveiled VMware Tanzu (a suite of programs to help enterprises manage applications on Kubernetes) during the third quarter, and it's in the process of shifting its vSphere server virtualization software to the open-source platform.
"This is like the magic bridge between those two worlds," said Gelsinger when discussing the benefits of Kubernetes and having virtual machines and containers accessible on the same management platform. "And literally the millions of people that operate VMware environments today are becoming Kubernetes-enabled tomorrow."
Gelsinger also said that he believes that Kubernetes is the most important technology he's seen since Java or the virtual machine itself.
VMware projects that fourth-quarter license revenue will climb 13% year over year to reach $1.39 billion, and that total revenue for the period will rise 13.8% to hit $2.95 billion. Non-GAAP earnings per share for the period are expected to come in at $2.16.
Hitting those fourth-quarter targets would bring the company's full-year revenue to $10.1 billion, up 12.5% year over year, with adjusted earnings of $6.58 per share -- up from $5.19 per share last fiscal year. The company expects free cash flow for the current fiscal year to wind up at $3.57 billion after accounting for acquisitions and integration costs, up from $2.95 billion last year.
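Those guidance figures can be cross-checked with simple arithmetic from the numbers in the article (this is only a sanity check on the reported figures, not additional disclosure):

```python
# Sanity-check VMware's full-year guidance against the quarterly figures above.
q3_revenue = 2.46        # $B, reported third-quarter revenue
q4_guide   = 2.95        # $B, fourth-quarter revenue guidance
full_year  = 10.1        # $B, implied full-year total

# Full year minus the two known quarters implies first-half revenue.
first_half = full_year - q3_revenue - q4_guide
print(f"implied first-half revenue: ${first_half:.2f}B")

# A full year up 12.5% year over year implies last year's revenue base.
prior_year = full_year / 1.125
print(f"implied prior-year revenue: ${prior_year:.2f}B")
```

The implied first half of roughly $4.7 billion and prior-year base of just under $9 billion are consistent with the ~12% growth reported for the third quarter, so the guidance arithmetic hangs together.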
Management also issued guidance for next fiscal year, with the note that it didn't include any expected impact from the Pivotal integration. Here's Gelsinger breaking down what shareholders can expect in terms of sales growth and segment and product contributions:
Now as we look to our next fiscal year, we're expecting the strength we're currently seeing in the business to continue. Preliminarily, we're planning for a fiscal '21 total revenue growth rate in the low double digits, not including the impact from the proposed Pivotal acquisition. We also expect hybrid cloud subscription and SaaS to drive much of the future growth of the business and show a significant increase in its percentage mix of total revenue.
The company expects that investment in its hybrid-cloud subscription and SaaS businesses and the integration of Carbon Black will significantly boost operating income but hurt operating margin by up to 2 percentage points next year.
Returning to current-year targets, VMware is now trading at roughly 24.5 times expected earnings and 18.4 times projected free cash flow.
Original post:
3 Key Takeaways From VMware's Third Quarter - The Motley Fool
Data auditing is the future of data privacy – SDTimes.com
It has been over a year since the General Data Protection Regulation (GDPR) went into full effect to ensure the protection and privacy of an individual's personal data. Several high-profile data breaches, and the large fines imposed as a consequence of non-compliance, have sent shock waves around the world regarding the need for stronger data protection regulations. While, at first, the EU's right-to-be-forgotten principle seemed out of reach, it is becoming a standard, especially as more and more Americans begin to question how organizations are using their data. Now, with the passing of the California Consumer Privacy Act (CCPA), which will take effect on January 1, 2020, the need for businesses to adhere to data regulations is becoming a reality. The impact this will have on organizations remains to be seen, but what's clear is that consumers around the world are starting to demand change.
And, as more regulations are put in place and consumers begin to care more about how their personal data is being used, it is likely we'll see other states follow suit and pass similar data privacy laws. In fact, New York state recently proposed a privacy bill that would be far bolder than California's, but as other states try to pass similar legislation, lawmakers say a national data privacy bill is still far away. Regardless, organizations should start looking ahead of these regulations to avoid costly fines and prevent lost business opportunities in the future.
Navigating the changing data privacy landscape may be overwhelming at first, but it's something that organizations should come to expect as data privacy and protection continue to take center stage. While headlines have been riddled with tales of major cybersecurity breaches in the U.S., the CCPA aims to give users more control, which means companies will need to work within new rules. Here's what businesses, DBAs, and data security professionals should consider as they approach data privacy and take stock of their policies.
Anticipate upcoming regulations
Reading is key to comprehending the underlying complexities of U.S.-specific data regulation policies, but it's also important to know how GDPR is affecting European countries as well. Remember, GDPR, though an EU compliance regulation, affects U.S. companies that process the personal data of data subjects who are in the EU, whether those companies act as controllers or processors. Consider investing in training for your IT staff to help them gain a better understanding of how these policies are going to affect their work. Holding quarterly training sessions to inform staff on the latest developments in statewide privacy policies will keep security-mindedness at the forefront of their work.
Perform a complete inventory of company data
As organizations begin to evaluate their data landscape, they should consider the following questions: What kind of data do we store? Who has access? Where is it stored? Is it secure? This provides a good starting point for the creation of a data privacy program. Performing an in-depth data inventory (or audit) will ensure you can track personal data processing activities across your company. This is an internal audit, not the same as a compliance audit. For DBAs, this is the toughest job, especially if you're dealing with multiple environments (on-premises and cloud), servers, virtual machines, databases, backups, etc., but it is one of the most important tasks if you want to avoid unnecessary fines in the long run. Performing a regular data audit will help promote visibility and provide DBAs with a better view of organizational processes that might have been neglected over time.
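As a concrete starting point, an inventory can begin by simply enumerating every table and column in each database so that nothing is overlooked. The sketch below does this for a SQLite database; it is a minimal illustration only (real estates span many engines and environments), and the function name is the author's own, not from any particular tool.

```python
import sqlite3

def inventory_columns(db_path):
    """Return (table, column, declared_type) rows for every user table
    in a SQLite database -- a minimal seed for a data inventory."""
    conn = sqlite3.connect(db_path)
    rows = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # Table names come from sqlite_master itself, so interpolating
        # them into PRAGMA is safe in this sketch.
        for col in conn.execute(f"PRAGMA table_info({table})"):
            # col = (cid, name, type, notnull, dflt_value, pk)
            rows.append((table, col[1], col[2]))
    conn.close()
    return rows
```

Each row of the result can then be reviewed and tagged (personal, sensitive, public) as the first pass of the audit.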
Simplify and automate data identification
After performing a full data audit, consider automating the data discovery process for your databases, based on a set of rules that define what personal and sensitive data mean for your company, so that future flagging of such data becomes easier. Data identification software is an elegant way to locate personal and sensitive data. Another step in data identification is to understand your company's risk regarding where data lives and who has access to it. Taking a proactive approach to data logging will eliminate a backlog and make future audits easier. It will also help keep the data supply chain secure by revealing the channels the data has traveled through.
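The rule-based flagging described above can be as simple as a dictionary of patterns scanned against stored values. The sketch below shows the idea with a few illustrative regular expressions; the pattern names and rules are examples only and would need tuning to the data your company actually stores.

```python
import re

# Illustrative rules only -- a real rule set would be defined by the
# company's own data privacy program.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def flag_sensitive(text):
    """Return the set of rule names that match the given text."""
    return {name for name, pat in PATTERNS.items() if pat.search(text)}
```

Running such a scanner over column samples from the inventory produces the flags that feed later audits.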
Utilize database auditing
Most database vendors offer database auditing utilities that can track and record where and when changes are made to data, and who made each change, in the event of a compliance audit. These utilities can also track the type of change made (i.e., insert, update, or delete) and generate a report that may be required by an auditor.
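Where a vendor utility is not available, the same change-tracking pattern can be built with triggers. The sketch below wires a minimal UPDATE audit trail onto a hypothetical `customers` table in SQLite; the table and column names are invented for illustration, and production systems would normally prefer the vendor's built-in audit features.

```python
import sqlite3

def add_audit_trigger(conn):
    """Attach a simple UPDATE audit trail to a hypothetical
    'customers' table using a trigger."""
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS customers (
            id INTEGER PRIMARY KEY, email TEXT);
        CREATE TABLE IF NOT EXISTS audit_log (
            table_name TEXT, row_id INTEGER, action TEXT,
            old_value TEXT, new_value TEXT, changed_at TEXT);
        CREATE TRIGGER IF NOT EXISTS customers_update
        AFTER UPDATE OF email ON customers
        BEGIN
            INSERT INTO audit_log VALUES (
                'customers', OLD.id, 'update',
                OLD.email, NEW.email, datetime('now'));
        END;
    """)
```

After this, every change to `customers.email` leaves a row in `audit_log` recording the old and new values and when the change happened -- exactly the record a compliance auditor would ask for.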
Leverage independent experts with in-depth knowledge of the regulations
Having the appropriate staff in place will make all the difference in applying data policy regulations. Consider an executive whose job is to monitor all things data protection. Having a strong policy presence at the top of the organization ensures policies are put into practice correctly. If appointing a policy executive to your C-suite is beyond your budget, consider leveraging an independent expert to evaluate your organization and provide advice on upcoming policy changes (in the case of GDPR, such a person is called a Data Protection Officer, or DPO). Having the appropriate people at the operational level ensures that business takes place inside a regulatory framework, and prepares IT operations to understand how new policies will affect existing databases.
GDPR has changed the way organizations around the world look at data, and CCPA will force similar data policy changes in the U.S. that will dramatically change the ways organizations and DBAs store user data. All companies with an online presence are data companies, which means all organizations need to take these regulations seriously. IT staff need to stay alert to how these changing conditions will affect the workflow for their specific sector of work. To be good citizens of the world, we must first be good students and learn from the changing policy around us. Our data depends on it.
Here is the original post:
Data auditing is the future of data privacy - SDTimes.com
Cloud Servers Market Analysis, Trends and Forecast to 2025|Dell, HP, IBM – Follow Real News
Electronics & Semiconductor
QY Research has recently published a new report, titled Global Cloud Servers Forecast & Opportunities 2019. The report has been put together using primary and secondary research methodologies, which offer an accurate and precise understanding of the Cloud Servers market. Analysts have used a top-down and bottom-up approach to evaluate the segments and provide a fair assessment of their impact on the global Cloud Servers market. The report offers an overview of the market, which briefly describes the market condition and the leading segments. It also mentions the top players present in the global Cloud Servers market.
The research report on the global Cloud Servers market includes a SWOT analysis and Porter's Five Forces analysis, which help in providing the precise trajectory of the market. These market measurement tools help in identifying drivers, restraints, weaknesses, Cloud Servers market opportunities, and threats. The research report offers global market figures as well as figures for regional markets and segments therein.
The Cloud Servers research report opens with an executive summary that gives a brief overview of the market. It mentions the leading segments and the players that are expected to shape the market in the coming years, offering an unbiased glance at the market. In the succeeding chapters, the research report on the global Cloud Servers market focuses on the drivers. It explains the changing demographics that are expected to impact demand and supply in the Cloud Servers market and delves into the regulatory reforms that are projected to shift perspectives. Additionally, researchers have discussed the very source of the demand to analyze its nature.
Get PDF template of this report: https://www.qyresearch.com/sample-form/form/1032445/global-cloud-servers-forecast-amp-opportunities
The report also sheds light on the restraints present in the global Cloud Servers market. Analysts have discussed the details highlighting the factors that are expected to hamper the growth of the market in the coming years. Evolving lifestyles, taxation policies, and the purchasing power of various economies have been scrutinized in great detail. The report presents fair points about how these restraints can be turned into opportunities if assessed properly.
Cloud Servers Market Competitive Landscape
The last chapter of the research report on the global Cloud Servers market focuses on the key players and the competitive landscape present in the market. The report includes a list of strategic initiatives taken by the companies in recent years along with the ones that are expected to happen in the foreseeable future. Researchers have made a note of the financial outlook of these companies, their research and development activities, and their expansion plans for the near future. The research report on the global Cloud Servers market is a sincere attempt at giving interested readers a comprehensive view of the market.
Cloud Servers Market Leading Players
Dell, HP, IBM, Oracle, Cisco, Fujitsu, Hitachi, NEC
Cloud Servers Market Segmentation
Through the next chapters, the research report reveals the development of the Cloud Servers market segments. Analysts have segmented the market on the basis of product, application, end users, and geography. Each segment of the global Cloud Servers market has been studied in depth. Analysts have evaluated the changing nature of the market segments, growing investments in manufacturing activities, and the product innovation that is likely to impact them. In terms of geography, the report studies the changing political environment, social upliftment, and other government initiatives that are expected to contribute to the regional markets.
Cloud Servers Segmentation by Product
Public Cloud, Private Cloud, Hybrid Cloud, Community Cloud
Cloud Servers Segmentation by Application
Application I, Application II
Questions answered in the report
Enquire for customization in the report @ https://www.qyresearch.com/customize-request/form/1032445/global-cloud-servers-forecast-amp-opportunities
About Us:
QYResearch always pursues high product quality with the belief that quality is the soul of business. Through years of effort and support from a huge number of customers, the QYResearch consulting group has accumulated creative design methods and a research team with rich experience across many high-quality market investigations. Today, QYResearch has become a brand of quality assurance in the consulting industry.
More here:
Cloud Servers Market Analysis, Trends and Forecast to 2025|Dell, HP, IBM - Follow Real News