Category Archives: Cloud Hosting

IBM CIO leaves for AWS and Big Blue flings sueball to stop him – The Register

IBM has flung a sueball at Jeff Smith, its former chief information officer, because he's trying to go to work for Amazon Web Services.

Big Blue filed a complaint [PDF] in a US district court in New York last week that says Smith threatens to violate his one-year non-competition agreement by going into direct competition with IBM as a senior executive of Amazon Web Services, one of IBM's main competitors in cloud computing.

The complaint also alleges Smith has already revealed some information to AWS CEO Andrew Jassy, violated directives not to retain presentations about IBM's new cloud, and then wiped his company-issued phone and tablet before leaving the IT giant, making it impossible for IBM to detect other communications with Jassy or determine if he transferred any other IBM information.

The complaint says Smith is one of only a dozen executives involved in top-level decision-making about IBM's next-generation cloud platform, has insider knowledge of IBM's security posture and was involved at the very highest level of internal discussions on IBM's transformation plan. If AWS can pick Smith's brains on any of those matters, IBM worries the cloud colossus will get an unfair advantage.

The filing says Smith's knowledge of its future cloud is critical, because those plans will help IBM evolve beyond its current status as a hosting-scale provider, making it more viable for IBM to match the cost economics of the market leaders. The filing adds that AWS is considered the market leader on cost and that IBM needs to match it.

That document's assertion that IBM's current cloud can't go toe-to-toe with others accords with analyst firm Gartner's recent assessment that it is SMB-centric, hosting-oriented and missing many cloud IaaS capabilities required by midmarket and enterprise customers.

That may soon change, as the filing says IBM's new cloud is set to be launched in the coming year. Yet Gartner also warned its customers that history suggests Big Blue will struggle to deliver its next-gen cloud on time.

Perhaps that's why Smith was willing to go from being CIO of world-girdling IBM, a gig that's a big step up from his previous jobs in Australia, to being a mere vice-president of AWS.

New York attorney general Eric T. Schneiderman last year criticised non-compete agreements, saying that "unless an individual has highly unique skills or access to trade secrets, non-compete clauses have no place in a worker's employment contract."

IBM alleges Smith has plenty of access to trade secrets, so it will be fascinating to see how the State's courts interpret the agreement.



How To Win at Cloud Hosting in a Microsoft-AWS-Google World – Redmond Channel Partner


Done right, hosting can be very profitable for partners, but it's hard to compete with the mega-vendors without taking a co-location approach.

Now that I'm out of the hosting business, competitive concerns no longer limit me from sharing my insights.

Done right, hosting can be very profitable. But the hosting landscape is changing rapidly and there are many pitfalls.

Most hosting providers run their own datacenters or they rent space for their equipment in a bigger facility (co-location). In every country where I've had hosting discussions, a key element is that the hosting provider will note that the data will stay in their country and that's why Microsoft, Amazon Web Services (AWS) or Google is less trustworthy. This is like sticking your head in the mud and hoping potential customers will not perform their due diligence.

In reality, all three vendors are working hard on data privacy and data protection. Both Microsoft and AWS are aggressively rolling out datacenters in multiple countries, so the risk is imminent that one of their datacenters will pop up within your borders. You're toast if that was your key competitive edge.

Running your own datacenters isn't sustainable in the long run. Today it costs a local hosting provider less to operate its own datacenters than to buy services from one of the three giants, but that's likely to change. The cost of running computing at mega scale is far lower than anything a regional player can ever match. I expect prices will keep falling for a few years until it becomes less profitable to run your own datacenter than to buy computing power from someone like Microsoft, AWS or Google.
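To make that crossover concrete, here is a minimal back-of-the-envelope sketch in Python; every figure in it is invented for illustration and none comes from this column, but it shows how steadily falling hyperscale prices eventually undercut a flat own-datacenter cost.

```python
# Hypothetical illustration of the cost-crossover argument above.
# None of these figures come from the article; they only show the shape of the curve.

own_dc_yearly_cost = 500_000      # running your own datacenter: roughly flat
cloud_yearly_cost = 650_000       # equivalent capacity from a mega-cloud, today
cloud_annual_price_drop = 0.12    # assume prices fall ~12% per year at hyperscale

for year in range(1, 8):
    cheaper = "own datacenter" if own_dc_yearly_cost < cloud_yearly_cost else "cloud"
    print(f"Year {year}: own DC ${own_dc_yearly_cost:,.0f} vs cloud ${cloud_yearly_cost:,.0f} -> {cheaper} is cheaper")
    cloud_yearly_cost *= (1 - cloud_annual_price_drop)
```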

To prepare, don't invest in your own datacenter facility. Instead, go with co-location, which will help your profitability short term and make it easier for you to gradually cut over to Azure or AWS.

To be able to scale up while being profitable, you need to make big bets and take an industrialized approach. At my old hosting company, Idenet, we did this by building our solutions with standardized building blocks.

It was also key to not be shy about the fact that with standardization and automation, you can reduce man hours. At Idenet we managed to grow the business without adding people, and you might even be able to reduce the number of people if you do it right. You will need great people taking care of delivering your services, but more and more basic tasks are being done through great tools.

That leads me to another key insight: Don't let your technical people go out and buy best-of-breed tools and solutions. I found it crucial to take a strategic approach where we said that our preferred vendor was Microsoft for everything. Microsoft doesn't have the best software in every single category, but that's not important. It often has solutions that are good enough, and with the company's pace of innovation, a better release will be coming soon.

The main advantage with Microsoft is that its products often work well together. If you go with best-of-breed, you end up with more people, less integration and higher costs.

If you want to survive and thrive as a hosting provider, you can't run your own datacenters in the long run, and you can't just focus on infrastructure like virtual machines. Instead, you'll need to climb the value chain and take responsibility for complex applications and workloads. That's where the magic happens! Only then will you be in a place that keeps you from becoming obsolete.

When you focus on applications and workloads, it becomes less important who actually owns the datacenter. The true margins lurk higher up in the stack.


About the Author

Per Werngren has held many roles at the worldwide level of the International Association of Microsoft Channel Partners (IAMCP), including chairman and president.


As Tech Execs Rally Around Kushner, Government Cloud Adoption Still Has Ways to Go – Windows IT Pro

White House Senior Adviser Jared Kushner heads the White House Office of American Innovation, which is tasked with modernizing government IT.


Six weeks after a group of tech executives traveled to Washington, D.C. for a June meeting with President Donald Trump and his advisers, Jared Kushner, the president's son-in-law, and his team are starting to work with companies, including Apple and Google, to get government to use technology more effectively.

According to a report by Recode, Kushner and other top advisers had a private call last week with major tech companies that are members of the American Technology Council, asking for input on modernizing government IT. One of the ideas on the table is a system where leading tech engineers do tours of duty advising the U.S. government on some of its digital challenges, Recode says.

Though details are scarce at this point, that idea is not a new one. The U.S. Digital Service has run a similar program where it recruits top technologists for term-limited tours of duty with the Federal Government.

The American Technology Council, which was formed in May, is led by Kushner's White House Office of American Innovation (WHOAI), a small team focused on bringing new thinking and real change to the country's toughest problems, according to a report by Politico.

So far, consensus around the effectiveness of WHOAI is mixed, with critics worried that Kushner's split focus will mean critical projects like moving more agencies to the cloud get left behind. On the other hand, proponents praise his ability to spot problems, figure out who's already working on them, and then provide whatever help they need to do a better job, an approach that doesn't cut into federal IT budgets. One of the tangible wins of WHOAI so far is fixing the VA's electronic health care system.

Just as well-known tech companies use rapid experimentation to test new approaches, government can too, using existing resources, a report by Brookings that looked at ways Kushner can modernize government said. For example, the Department of Education ran quick, virtually cost-free tests to see which email messages worked best in reaching borrowers in default on student loans. Within a few weeks, it had the answers. It used that information to help thousands of individuals shift to more manageable repayment plans.

As long as Kushner can keep persuading agency secretaries, CEOs and civil servants to get together and talk, he has a shot at making progress on some of the most intractable issues that have long stymied Washington, from federal agency mainframes to well-maintained roads and bridges, Politico said.

Arguably one of the biggest technology initiatives at the federal level that has carried over into this administration is the shift to cloud computing. Under former President Barack Obama, the government adopted a Cloud-First Initiative in 2011, under which agencies were encouraged to adopt cloud-based services in lieu of expensive on-premises data centers. Along with the initiative, the government has been consolidating its data center footprint as part of its Data Center Optimization Initiative (DCOI), a move that has yielded cumulative savings of $2.2 billion from 2012 to 2017. By 2018, the government hopes these savings will reach $2.7 billion.

Indeed, cost continues to be a primary driver for adopting cloud in the public sector. According to a report by MeriTalk last year, primary motivations for moving to the cloud are cost savings (46 percent), increased flexibility (42 percent) and legacy systems reaching their end of life (35 percent). This last point is particularly interesting as feds continue to spend more than 80 percent of their time and budgets on legacy system life support, according to a separate report by MeriTalk.

According to government IT services provider CSRA, there are five key roadblocks that are preventing federal cloud adoption. These are: concerns around cloud security; organizational culture and maturity; lack of readiness to adopt cloud technologies; perceived lack of control; and immaturity of federal procurement models.

To its credit, the government does acknowledge that roadblocks exist, and is slowly making headway on removing some of them. For example, a lot of the cultural barriers to cloud exist because of a lack of education. In a June report the USDS outlines its efforts in providing digital service training to help the government become a smarter buyer of technology once it establishes a specialized procurement workforce that understands the digital and IT marketplace, agile software development methodology, cloud hosting, and the DevOps practice of integrating system operations with application development teams.

If WHOAI is going to be successful at modernizing government IT, public-private partnerships are just the start. The cultural changes within the government needed to fully embrace technology could be what makes or breaks the momentum of the initiative, a shift IT pros will be watching play out over the coming months.


Heavy clouds in IT world make it rain gold for UPS box manufacturers – The Register

The growth in cloud computing services is creating a financial windfall for perhaps an unlikely source: backup battery vendors.

As the demand for larger and more reliable cloud data centers has grown, hosting companies are increasingly building up their stock of backup power supplies, and as a result, uninterruptible power supply manufacturers will profit.

According to research from IHS Markit, revenues for UPS devices are on the upswing after years of decline following their 2011 peak. While 2016 saw the market finally eke out some growth, analysts believe UPS revenues will continue to rise in 2017 before taking off in full by 2018.

[Chart: 2015-2018 projected UPS revenues]

"Digitization continues, with the number of internet users and internet connected devices growing rapidly, leading to increasing demand for compute," said IHS research analyst Lucas Beran.

"With colocation and cloud service providers best poised to meet those demands, 2016 may be a glimpse into what the future holds for the UPS market."

The reason for this, says IHS, is the growing demand for the UPS devices that provide power backups for cloud and colocation data centers. Many of these centers employ truly massive amounts of equipment in each location, and in most cases require uptime rates well over the 99 per cent mark to keep customers happy.
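To see why those uptime targets translate directly into spending on backup power, here is a small illustrative calculation (my own arithmetic, not IHS data) converting availability percentages into allowed downtime per year.

```python
# Convert availability targets into allowed downtime per year.
# Illustrative only; the targets below are common industry figures, not IHS data.

HOURS_PER_YEAR = 24 * 365

for availability in (0.99, 0.999, 0.9999):
    downtime_hours = HOURS_PER_YEAR * (1 - availability)
    print(f"{availability:.2%} uptime allows about {downtime_hours:.1f} hours of downtime per year")
```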

In particular, the researchers believe data center companies are taking a close look at lithium-ion-based battery units, which have fallen in price recently, making the units affordable at scale.

"With significant decreases in price in recent years, lithium-ion batteries are warranting more consideration. However, despite the recent decreases in price, lithium-ion batteries still carry a much higher total initial investment than traditional valve-regulated lead acid (VRLA) batteries," explained Beran.

"But for this higher cost, the benefits of lithium-ion including higher energy density in a smaller footprint and a longer life cycle lead to a total cost of ownership that is on par, if not less than, VRLA batteries in most instances."

As a result, analysts believe that UPS revenues will exceed $7.5bn this year and could top the $7.7bn mark in 2018.



Epic Move: UC San Diego Health Transitions to Cloud Technology – Newswise (press release)

UC San Diego Health has moved its electronic medical records (EMR) system to the cloud. The move to an Epic-hosted cloud environment is part of a long-term strategy to shift away from traditional data centers to a less expensive, more reliable and secure repository for patients' medical records.

"By creating greater operational efficiencies, we can invest more time and resources in patient care," said Mark Amey, associate chief information officer, UC San Diego Health. UC San Diego Health has deployed a number of strategies to allow its hospitals and clinics to be more agile and respond to demand at a rapid pace within a robust disaster recovery environment.

Cloud computing means storing and accessing data and programs over the internet in a hosted, shared environment rather than on a local server or personal computer. The cloud infrastructure is resilient, offering more uptime with redundant systems that protect data. UC San Diego Health plans to fully deploy cloud-based solutions for all data storage needs within three years.

"Health systems both large and small are seeking secure and cost-effective approaches to providing EMR capabilities to their users," said Stirling Martin, Epic senior vice president. "UC San Diego Health is the first academic health system to make the migration from their own self-hosted Epic infrastructure to Epic's state-of-the-art cloud hosting environment."

A cloud-hosted environment helps UC San Diego Health meet industry standards for safeguarding patients' protected health information. Cloud computing enables timely cybersecurity updates and patching as well as heightened security controls. Cloud computing also permits easier disaster recovery and enables hosting vendors to focus on application-specific security needs.

"This is our first significant milestone in moving key pieces of infrastructure into the cloud to provide always-on solutions from anywhere that can be scaled to our growing geographical footprint," said Adam Gold, chief technology officer, UC Irvine Health and UC San Diego Health. "The cloud approach allows us to better provide innovative technology to support outstanding clinical care, research, and teaching."

The information services team has moved approximately 10,000 workstations at UC San Diego Health to this virtual delivery method, allowing users to access the electronic health record via the cloud. The team has also integrated over a hundred third-party applications that work with Epic within the new cloud environment.

UC San Diego Health is the hub for a single electronic medical records system serving UC Riverside Health and community practice affiliates, a cost-saving arrangement that improves coordination of care among physicians. UC San Diego Health will also share its EMR system with UC Irvine Health starting this November.

Additionally, Christopher Longhurst, MD, chief information officer at UC San Diego Health, is the sponsoring executive for the development of a UC Health-wide data warehouse, integrating patient data across UC's five academic health systems, which together comprise the fourth largest health care system in California. This initiative supports medical decision making, clinical research and population health throughout the state.

In July, UC San Diego Health was named one of the nation's "Most Wired" health systems by Hospitals & Health Networks magazine. The award recognizes hospitals and health systems that excel in using information technology to advance patient care and population health, protect the privacy and security of patient information, and bring greater efficiencies to operations.

###


Global Health wins contract to deploy cloud system at Sydney hospital – Proactive Investors Australia

Global Health has secured a contract to deploy its MasterCare Patient Administration System (PAS) at the Arcadia Pittwater rehabilitation hospital in the Northern Beaches of Sydney.

Launching in early 2018, the 5-year contract will see MasterCare PAS hosted on Global Health's Altitude Cloud Service.

MasterCare PAS simplifies communication for a seamless flow of information from the time a patient presents at the hospital reception up to the time the patient is discharged.

The system is easy to use, with customisable menus to match the hospital's workflow, and has the ability to transact with health funds for insurance eligibility checks and claims.

Andy Hui, cloud solutions manager, commented: "The last 18 months have seen a major uptake of cloud hosting in the healthcare industry.

"People are starting to realise the significant benefits of hosting their software and data in the cloud."

Global Health is a provider of digital health solutions that address the needs of administrators, funders, clinicians and consumers across the healthcare industry.

Recently celebrating its 30-year anniversary, the company is one of Australia's longest-established and most experienced developers of software applications for the healthcare industry.

Global Health's core software products include MasterCare EMR, MasterCare PAS, PrimaryClinic, ReferralNet, Altitude Cloud Hosting, HotHealth and LifeCard.

Arcadia Pittwater is a new 85-bed, purpose-built rehabilitation hospital that is part of the Arcadia Health Care group.

Located in the heart of Sydney's Northern Beaches, the new hospital offers medical services to support patient recovery.


MasterCare PAS is the core system for a number of other rehabilitation hospitals across Australia and continues to be recognised as one of the most comprehensive administrative systems available.

ReferralNet Secure Message Delivery is Global Health's secure message delivery platform for the exchange of confidential clinical and patient information between healthcare providers.

ReferralNet already integrates with Global Health's existing clinical systems, including MasterCare EMR, MasterCare PAS, PrimaryClinic and MasterCare+ for Referral Management.

Achieving interoperability with other vendors will expand Global Health's footprint in the secure messaging market.

The Australian Digital Health Agency's (ADHA) trials are helping fast-track interoperability efforts between Global Health's ReferralNet and Telstra Health's Argus.

The trials are a vital step to the healthcare industry becoming fax-free, a structural shift that will result in considerable cost savings.


Project Manager – Amazon, Google, Microsoft Cloud Hosting – Dorking and Leatherhead Advertiser

Project Manager - Amazon, Google, Microsoft Cloud Hosting
Farnborough
Circa £35,000 + Perks + Fun Workplace

We're seeking a talented individual to Project Manage cloud hosted technology solutions serving the needs of our partners and clients - modern, creative digital agencies and their brands.

This newly created role will take responsibility for the successful oversight and delivery of all client web technology projects and consultancy across Amazon, Microsoft, Google Cloud and dedicated hardware.

We're looking for a true team collaborator that:-

+ Thrives on being client-facing and working across the business
+ Relishes orchestrating a diverse range of projects from contract to go-live
+ Has agency/brand-side knowledge of web infrastructure and related technologies
+ Knows how to manage key dependencies in the lifecycle of a project
+ Is familiar with swift planning and management of commercial resources

You'll possess a winning personality, be grounded in practical common sense, and be the solid enabler with the ability to bridge communications effortlessly across Tech, Service, Clients and Suppliers.

You'll have a strong desire to progress your professional career in project management, perhaps transitioning from a software/hardware role, having found your passion for empowering tech teams to succeed.

We are rich in technology leadership to support your development, with super engaged tech founders who empower the whole organisation to stay ahead of the curve in today's digital landscape. Most importantly there's plenty of scope for a talented individual to grow in this role.

Our office overlooks the private jet runway at Farnborough airport, and we have great benefits including free parking, mid-week team lunches plus a great social environment with film nights, drone club and team nights out organised by our very own Ministry of Fun.

Why not reach out and explore with us?

As a minimum we're looking for:-

+ 2 years' diverse technology project experience in a customer-facing role
+ Knowledge & understanding of web-related technology services
+ Proven ability to build and manage flexible project frameworks free of bureaucracy
+ Confident & neutral project management outlook

We are an equal opportunities employer and welcome individuals who are in possession of the appropriate requirements to work within the UK/EU.

You may have worked in the following capacities: Technology Project Manager, Managed Services Project Manager, IT Services Project Manager, Hosting Project Manager, Cloud Services Project Manager, Project Coordinator.

Interested? Just Apply Below...

In 2005 we ripped up the rule book to deliver a recruitment agency experience that makes everyone feel just that little bit happier. By applying you consent to us processing & passing your application to our client for review for this vacancy only. If your skills match the role you will hear back from us within 2 business days. Good luck, Team RR.


From Public Cloud to Hybrid IT Straight From the Horse’s Mouth – InfoWorld

By Gary Thome, VP and Chief Engineer, HPE Software-Defined and Cloud Group

People who bet on horse races know that the most reliable information about a horse will come from those who are closest to that horse -- a trainer or someone working in the stable. The idiom "straight from the horse's mouth" implies that someone has gotten even better information: a tip from the horse itself!

Over the past two months, I've been writing articles about common public cloud concerns and why hybrid IT gives you the right mix of public cloud, private cloud, and on-premises solutions. Of course, I don't expect you to take my word for it. But maybe you will take advice from those who tried public cloud and have since moved to hybrid IT.

In this article, I've compiled a few quotes from a variety of organizations of all sizes and industries. Each has experienced public cloud and has since moved to hybrid IT. So here you go, advice straight from the horse's mouth:

Public cloud performance issues

"We're moving some of our web infrastructure workloads back to our own data center. Performance was the big issue. Lack of communication from our public cloud provider led to a customer-facing outage we could not have prevented. Now we're seriously rethinking our entire cloud strategy and are moving more workloads back on-prem." - Social media company

"We're bringing back apps that we had put on the public cloud; it didn't go as planned because of reduced performance compared to when they were on-prem. That said, there are other apps we feel are truly more cloud-ready, and we're trying those out in the cloud so we can have what we deem to be a truly hybrid solution." - Financial services company

Public cloud cost concerns

"We opted to be bleeding-edge and move to cloud when storage and hypervisor license costs were on the climb. It seemed to be a better way to keep costs in an opex mode and know what they were. That worked slightly at first. We used a lot of IaaS and SaaS within AWS. As we grew, the usage did too and the bills grew far faster. We honestly reached a tipping point when billing started to approach $100,000 per month. We had good financial outcomes moving on-prem with a hybrid solution so we can access cloud when we choose. We now have two times the horsepower on-prem for half the cost." - Social media company

"One of the biggest issues driving declouding for us is that, to be honest, the initial move to the public cloud was done willy-nilly. We moved the workloads and then we moved the apps to run off those workloads. But we didn't really think about how to stage properly, how to control usage costs, and how to design an exit strategy. We quickly learned that cloud costs were far higher than we expected." - Retail company

Public cloud control concerns

"Flexibility is important. With a private cloud, we get the control and flexibility of a dedicated environment that's tailor-made to address our specific IT needs. For fast-growing companies like ours with constantly changing requirements, a private cloud offers more flexibility to adapt and evolve as the company changes." - Social media company

"The data we hosted on AWS was growing exponentially, as all data does, and that increased costs. Public cloud hosting served its purpose when we entered the all-cloud (for the most part) approach, but then you hit this point where it doesn't make financial or operational sense any longer, when the same thing can be accomplished on-prem for less money and less hassle. Having the workloads back on-site gave us better control over usage, and we could better see spikes in activity." - Public sector company

The benefits of Hybrid IT

Many businesses have already started to move beyond the public cloud into a new era of hybrid IT that combines public cloud, private cloud and traditional IT. New offerings such as hyperconverged and composable infrastructure deliver cloud-like capabilities in on-premises solutions that can give businesses more control, greater performance, lower cost, and less risk than many public cloud options. On-premises, software-defined options within a private cloud, seamlessly combined with public cloud, let businesses build the best possible infrastructure for their individual workloads.

"The lower cost of storage, servers, and even better servers that require fewer hypervisor licenses made it less costly than it once was to scale out a data center or private cloud on-site." - Energy company

Organizations all over the world are taking a closer look at their applications and deciding which ones should be in the public cloud and which ones should remain on traditional IT or a private cloud. Although performance, cost and control issues are all relevant, you should also consider the business model of each workload in your decision.

HPE has assembled an array of resources to help you transition to a brand-new hybrid IT world. You can learn more about HPE composable infrastructure powered by Intel by reading the e-book, HPE Synergy for Dummies, or learn about HPE's approach to hybrid cloud by checking out the HPE website, Project New Hybrid IT Stack. And to find out how HPE can help you determine a workload placement strategy that meets your service level agreements, visit HPE Pointnext.


How companies can boost their website in China’s clouded market – Cloud Tech

Company name translated into Chinese? Tick. Chinese social media accounts up and running? Tick. A Chinese-language website? Tick.

The IT team managing your Chinese website clicks deploy, and you're now on the way to conquering the lucrative Chinese market.

The euphoria of global expansion endures for a few days, but then the teething problems start stacking up. Website availability is patchy. No one is coming to your site except for spammers and existing customers who complain about slow load speed. Moreover, your website is lost eight pages deep on the Chinese equivalent of Google.

Website load speed is crucial anywhere in the world and especially in a mobile-centric market such as China. Hosting your website outside of China causes slower response times due to limitations on international bandwidth into China and high latency. This problem is especially acute for companies hosting their website in distant locations such as Europe and North America.

The most effective way to overcome these issues is to host your website in China. Hosting a website in China reduces site load time, minimizes latency, and is likely to improve search engine visibility in China over the long-term.
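A quick way to gauge how much hosting location is costing you is to time a handful of requests from inside the target market and compare them with timings taken near your current servers. The sketch below assumes the third-party requests library is available and uses https://example.com as a stand-in for your own site.

```python
# Rough check of page response time from wherever this script is run.
# Run it from a machine (or cloud VM) inside mainland China and compare the numbers
# against the same script run near your current hosting location.
import time
import requests

URL = "https://example.com"   # placeholder; substitute your own site
SAMPLES = 5

timings = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    requests.get(URL, timeout=30)
    timings.append(time.perf_counter() - start)

print(f"median response time: {sorted(timings)[len(timings) // 2]:.2f}s over {SAMPLES} requests")
```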

The other option is to deploy your website on a China-based Content Delivery Network (CDN). A CDN will cache your website on a distributed network of nodes. When a user in China requests access to your site, the CDN will serve a copy of your website from the closest node to the end-user. This dramatically reduces latency and is an ideal approach for companies that do not wish to migrate their origin server to a new location.
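The nearest-node idea can be shown with a toy sketch. Real CDNs steer users via DNS resolution or anycast routing rather than anything this literal, and the node locations below are invented, so treat this purely as an illustration of the principle.

```python
# Toy illustration of the "serve from the closest node" idea behind a CDN.
# Real CDNs pick nodes via DNS resolution or anycast; this just shows the principle.

CDN_NODES = {
    "beijing":   (39.9, 116.4),
    "shanghai":  (31.2, 121.5),
    "guangzhou": (23.1, 113.3),
    "hong_kong": (22.3, 114.2),
}

def closest_node(user_lat, user_lon):
    # crude squared-distance comparison; good enough for an illustration
    return min(
        CDN_NODES,
        key=lambda n: (CDN_NODES[n][0] - user_lat) ** 2 + (CDN_NODES[n][1] - user_lon) ** 2,
    )

# A user near Chengdu (30.7N, 104.1E) would be served from the nearest cached copy.
print(closest_node(30.7, 104.1))
```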

To deploy your website on a hosting server or CDN in China, you will first need to obtain an ICP (Internet Content Provider) license from the Ministry of Industry and Information Technology (MIIT). If your business is not eligible for an ICP license, an alternative is to deploy your website on a server or CDN node located in Hong Kong.

The scalability and built-in elasticity of the cloud are purpose-built for large online markets such as China, and businesses are realizing this advantage. Bain & Company predicts cloud computing sales will swell to 20% of China's total IT market by 2020, up from a mere 3% in 2013.

Given the massive number of online users in China dispersed across distant geographic locations, cloud hosting offers businesses the ability to maximize coverage and respond in real-time to sudden changes in traffic. This includes adding new deployment regions and availability zones and the option to release resources when traffic subsides after a promotional event or unexpected spike in activity.
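The elasticity being described boils down to a simple decision rule: provision enough capacity for current traffic, then release it when the spike passes. The sketch below uses entirely hypothetical capacities and traffic figures; in practice you would rely on the cloud provider's own auto-scaling service rather than hand-rolled logic.

```python
# Toy autoscaling decision: add instances during a promotional spike,
# release them once traffic subsides. All thresholds and numbers are invented.

def desired_instances(requests_per_second, per_instance_capacity=500, minimum=2):
    needed = -(-requests_per_second // per_instance_capacity)   # ceiling division
    return max(minimum, needed)

traffic_samples = [300, 4_000, 12_000, 2_500, 400]   # e.g. before, during, after a sale
for rps in traffic_samples:
    print(f"{rps:>6} req/s -> run {desired_instances(rps)} instances")
```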

Alibaba, for instance, has broken records during promotional periods by leveraging the cloud to process up to 175,000 orders in just one second.

China's tech-savvy population is leading the way in adopting mobile payments, online-to-offline (O2O) services and mobile gaming, and in designing their lives around their smartphones. While Android is the leading mobile operating system in China, with approximately 74.4% of the market as of February 2017, it can be susceptible to external attacks. To address mobile vulnerabilities and data security, companies need to carefully assess website and mobile security.

Cloud hosting provides access to a range of security products to protect your website from malicious attacks, including free services such as anti-DDoS protection and real-time monitoring. Advanced security products are also a must for commercial websites that integrate online payments. Cloud-based security products including server guard, mobile security, and web application firewall (WAF) can be integrated into your cloud architecture to protect against high volume DDoS attacks and other cyber intrusions.

To mitigate the threat of attacks, it is also vital to regularly update and back up your website. If your website is deployed on WordPress, this also means upgrading your WordPress theme and plugins to eliminate potential loopholes that hackers can exploit.
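For a WordPress site, that update-and-backup routine can be scripted. The sketch below assumes WP-CLI (the standard WordPress command-line tool) is installed on the hosting server and is run from the WordPress directory; scheduling and error handling are left out.

```python
# Minimal maintenance sketch assuming WP-CLI is installed on the hosting server.
# Paths, scheduling (e.g. cron) and error handling are deliberately omitted.
import subprocess

def run(cmd):
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["wp", "db", "export", "backup-before-update.sql"])   # back up the database first
run(["wp", "core", "update"])                             # WordPress core
run(["wp", "plugin", "update", "--all"])                  # all plugins
run(["wp", "theme", "update", "--all"])                   # all themes
```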

China is a highly competitive market, and consumers expect a smooth and secure online experience. The flexibility, scalability, and security offered by the cloud provide an optimal solution for boosting your website in China's competitive online space.


Cloud computing facility launched – Times of India

PATNA: Deputy CM Sushil Kumar Modi on Thursday launched Bihar Cloud, a virtual server in which applications and data related to 40 departments of the state government will be stored.

Simply put, cloud computing is the practice of using a network of remote servers hosted on the internet (the cloud) to store, manage, and process data, rather than a local server or a personal computer.

Launching the service at the headquarters of Beltron here, Modi said, "Cloud computing will enable us to store all government data and applications at one place. The facility has been developed considering the requirements of the state government till the year 2019. The IT department has already received hosting request of additional 25 portals or applications from 17 departments which have been approved."

IT department secretary Rahul Singh said storage capacity of Bihar Cloud is 200 terabytes with RAM capacity of 4200GB. Modi said progress has also been made in some ambitious projects, such as IT Tower at Dak Bungalow, IT Park at Bihta and IT City at Rajgir. The deputy CM said a proposal regarding acquisition of 25 acres of land at Bihta for development of the proposed IT Park will be tabled before the state cabinet soon.

"We are also in the process of acquiring 92 acres of land at Rajgir for developing an IT City which will provide a platform to IT and electronics manufacturing firms to set up their units," Modi said.

The announcement of the proposed IT City at Rajgir was first made by CM Nitish Kumar during the two-day national-level information technology (IT) meet titled "e-Bihar - an emerging IT destination". Nitish had also announced that the state government would develop an IT Tower near the Dak Bungalow crossing. "Bidding process is underway for selection of a consultant for development of the IT Tower by Infrastructure Development Authority," Modi said.
