Category Archives: Cloud Servers
IT Monitoring in a Hybrid Cloud World – Virtualization Review
The Cranky Admin
Tracking what is going where now requires a completely different strategy.
One of the Web sites for which I am responsible is down. Determining why it's down is a bit of a journey. Just 10 years ago, figuring out what had gone wrong, fixing the problem and altering procedures to prevent recurrence would have been relatively easy. Today, however, hybrid IT is the new normal, and solving these sorts of problems can be quite complex.
Ten years ago I had all my clients hosting their Web sites on their own servers. On behalf of my clients, we ran email servers, DNS servers, caching, load balancing, intrusion detection, front-end, database, a box full of crazed squirrels, you name it. None of the datacenters I oversee are large, but at their peak, several of them ran a few thousand workloads.
This was in the days before desired state configuration and the "pets vs. cattle" debate. There were a lot of pets in these datacenters.
As you can imagine, workloads gradually migrated into the public cloud. Web-facing stuff went first, because it had a lot of infrastructure "baggage." Ever more mission-critical workloads followed until -- seemingly without anyone noticing -- the on-premises datacenter, the hosted solutions at our local service provider and the public cloud workloads were scattered across the continent.
Despite the geographic dispersal of workloads amongst various providers, however, any given client's workloads remained critically conjoined. What was out in the public cloud fed into the on-premises systems, and everything had to be synchronized to the hosted systems for backups. If the wrong bit fell over, everything could go sideways.
This could be useful additional diagnostic information for me, or a separate fire to put out. I won't know until I'm a little further down the rabbit hole, but it is troubling.
Having spent years with pre-virtualized one-application-per-metal-box workloads, whenever something stops working my first instinct is to look for hardware failure. Today, that would mean seeing if the virtual servers, hosting provider or public cloud had fallen over.
A quick look-see shows that I can connect to all the relevant management portals, which claim all the workloads are up and running. Unfortunately, I can't seem to log in to any of these workloads using SSH. This is alarming.
The hosting provider gives me console access to workloads -- something that, sadly, my public cloud provider does not -- and I am able to quickly assess that the various Web site-related workloads are up and running, have Internet access, and otherwise seem healthy, happy and enjoying life. They are not currently handling customers, which means that the switchover mechanism believes the primary workloads are still active.
I get an email on my phone, so something has to be working with the public cloud hosted workloads; part of the mobile email service chain lives there. I hop on Slack and ask a few of my sysadmin buddies to test my Web site. Some of them can get there, some of them can't.
While I pour coffee into my face and curse the very concept of 6 a.m., a phone call comes in from a panicked sales manager: only orders from one specific Web site showed up in the point-of-sale system overnight. Five other Web sites haven't logged a single order.
Rather than drag you through each troubleshooting stage, I'll jump right to the end: the answer was DNS. More specifically, the outsourced DNS provider had a really interesting oopsie where half of their resolvers wouldn't resolve half of our domain names and the other half worked perfectly. This broke nearly everything, and we weren't prepared for it.
In the case of my early morning outage, because there was not actually anything wrong with the Web site, and the hosting provider provides a caching DNS server, the monitoring solution didn't see anything wrong. It could resolve domain names, get to the relevant Web sites, see email passing and so forth.
Back in the day when everything ran from a single site, this was fine. Either things worked, or they didn't. If they didn't work, wait a given number of minutes, then flip over to the disaster recovery site. Life was simple.
Today, however, there are so many links in the chain that we have to change how we monitor them. DNS, for example, clearly needs to be monitored from multiple points around the world so that we can ensure that resolution doesn't become split-brained. Currently none of our customers use geo-DNS or CDN-based regional Web site delivery, but it's been discussed. That would add yet another layer of monitoring complexity, but this sort of design work can't be ignored.
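That multi-vantage-point DNS check can be sketched in a few lines. Everything here is illustrative -- the probe names are made up, and a real probe would issue actual DNS queries from its own region; only the comparison logic that flags split-brained resolution is shown.

```python
def is_split_brained(answers_by_vantage):
    """Return True when vantage points disagree about a name's resolution.

    answers_by_vantage maps a probe label to the set of addresses it got
    back; an empty set means resolution failed from that probe.
    """
    distinct = {frozenset(addrs) for addrs in answers_by_vantage.values()}
    return len(distinct) > 1

# The failure mode from the outage above: some resolvers answer,
# others return nothing.
observed = {
    "probe-us-east": {"203.0.113.10"},   # hypothetical probe names
    "probe-eu-west": set(),              # resolution failed here
    "probe-ap-south": {"203.0.113.10"},
}
print(is_split_brained(observed))  # -> True
```

A monitoring system built this way alerts on disagreement between probes, which a single caching resolver sitting next to the monitoring server can never see.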
There is middleware that collects order tracking information from manufacturing, invoicing from points of sale, information from the e-stores and logistics information from the couriers. All of this is wrapped up and sent to customers in various forms: there are emails, desktop and mobile Web sites and SMS pushes. I think one client even has a mobile app. The middleware also tracks some advertising data from ad networks and generates reports.
Somewhere in there is email. Inbound email goes through some hosted anti-spam and security solutions. Outbound email comes from dozens of different pieces of software that will forward through smart hosts at various points until they are funneled through the main server located in the cloud. Email can originate from end users or from office printers, manufacturing equipment, the SIP phone system or any of dozens of other bits of machinery.
None of the clients I act as sysadmin for currently has more than 200 users. Most are in the 50-user range. None of the technology they have deployed is even as complicated as a hybrid Exchange setup or hybrid Active Directory.
Despite this, these small businesses are thoroughly enmeshed in hybrid IT. This multi-site, multi-provider technological interconnectivity means changing how we think about monitoring.
Hybrid IT is not a novelty. It's not tomorrow's technology. It's the everyday business of everyday companies, right now, today. Are you ready?
About the Author
Trevor Pott is a full-time nerd from Edmonton, Alberta, Canada. He splits his time between systems administration, technology writing, and consulting. As a consultant he helps Silicon Valley startups better understand systems administrators and how to sell to them.
Cloud-native vendor consolidation key to container technology adoption – TheServerSide.com
As the enterprise Java space matured around the turn of the century, vendor consolidation quickly reduced the number of viable application server offerings. Stalwarts like JRun and Borland's Enterprise Server quickly became passé, and other application server providers were either bought or were overshadowed by the IBM WebSphere and BEA WebLogic offerings. Vendor consolidation in the Java EE space reduced the number of offerings to just two or three big vendors, with a couple of competitive open source offerings thrown in for good measure.
Today, almost 20 years later, the age of the server-side application server is said to be dead, or at least dying slowly. We are now living in a new age of stateless microservices writing to NoSQL databases, deployed into Docker containers that are hosted on virtual machines whose hypervisors are provisioned by pay-as-you-go clock cycles in the cloud. It's a brave new world, but it's a fragmented world as well, not dissimilar to the way things were when the enterprise Java specification was originally released.
Every evangelist with enough strength to stand atop a soap box is preaching the benefits of migrating to container-hosted microservices. Unfortunately, stepping through an online tutorial on how to create a Java-based microservice and subsequently run it in Docker is merely a fun first step. Production-ready microservices that are deployed into a set of individual containers require quite a bit of plumbing if an enterprise expects to do cloud-native computing right.
First and foremost, there's the challenge of doing dynamic container orchestration. For reliability and stability, a cloud-native application needs monitoring and alerting. Troubleshooting becomes more complex when using the cloud, containers and hypervisors, because code can be running in any number of hosting environments, and those environments are scattered across the globe. For the same reasons, distributed tracing becomes a challenge, too. Service discovery, authenticating remote procedure calls (RPC) and the provisioning of container runtimes are just a few more of the challenges with which organizations who throw away their application servers in favor of a purely cloud-native future must grapple.
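To make one of those challenges concrete, consider service discovery: containers come and go, so clients need a way to find the current address of a service and to notice when it disappears. The following is a deliberately minimal sketch of a TTL-based registry; it is not any particular CNCF project's API (real systems use tools such as etcd or DNS-based discovery), and all names and addresses are made up.

```python
import time

class ServiceRegistry:
    """Toy service registry: entries expire unless re-registered."""

    def __init__(self, ttl_seconds=30):
        self.ttl = ttl_seconds
        self._entries = {}  # service name -> (address, registered_at)

    def register(self, name, address, now=None):
        # A healthy container would call this periodically as a heartbeat.
        self._entries[name] = (address, now if now is not None else time.time())

    def lookup(self, name, now=None):
        entry = self._entries.get(name)
        if entry is None:
            return None
        address, registered_at = entry
        current = now if now is not None else time.time()
        if current - registered_at > self.ttl:
            return None  # entry expired: the container is likely gone
        return address

reg = ServiceRegistry(ttl_seconds=30)
reg.register("orders-api", "10.0.4.17:8080", now=100.0)
print(reg.lookup("orders-api", now=110.0))  # -> 10.0.4.17:8080
print(reg.lookup("orders-api", now=200.0))  # -> None (expired)
```

Production-grade discovery adds replication, health checks and change notification on top of this core idea, which is why organizations reach for off-the-shelf projects rather than building their own.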
Fortunately, these early adopters of the cloud-native, container-based approach are not scrambling through some unknown wilderness alone. The challenges associated with cloud-native computing are well known, and ways to address those challenges are becoming increasingly well defined. The Cloud Native Computing Foundation (CNCF) hosts nine open source projects under its umbrella, each of which tackles a unique subset of the challenges that organizations planning to deploy containers and microservices at scale might face.
Those who are going cloud-native are in very good company, with plenty of intellectual firepower helping them secure their beachhead based on microservices and containers.
There is nothing bad that can be said about any of these projects. However, it is difficult for even advocates to deny that when all of these projects are listed together, the result is intimidating to the middle manager who has to make important technology decisions. And it should be noted that this is simply the list of projects that fall under the purview of the CNCF. There are innumerable competitors in each of these spaces, whether they are separate open source projects, proprietary implementations or simply vendors building customized products on top of the aforementioned projects.
Technology aficionados love this type of disruptive, Wild West-type environment where multiple answers arise to each new problem that is encountered. But decision-makers hate it. This is why the future of this space is vendor consolidation.
Currently, making a cloud-based, container-backed, microservices environment work means choosing from many technologies. The big vendors in this space are looking at ways of hiding the names of the various projects that make cloud-native computing happen and, instead, blanketing those names with a well-established brand and logo. Decision-makers don't want no-name offerings, as they tend to create a great deal of uncertainty and risk. Instead, they want to simply be able to choose between Oracle and Red Hat, or between Microsoft and IBM.
Red Hat is certainly leading the way in helping to make the decision process easier with their OpenShift platform, as is Pivotal with their Cloud Foundry offering, but there are far too many competitors in this field, and too many subsegments to assert that any single one is leading the charge. Organizations like the CNCF, and vendors like Pivotal, will work hard to move the industry forward, but in the background, the big players like IBM, Oracle and Microsoft are looking to acquire a variety of technologies to produce a single offering that makes deployment easy, centralizes application management, boasts DevOps integration and provides high-level governance and policy enforcement. And what's funny is that this final offering will end up looking very much like what we've always known as a traditional, server-side application server. So much for those who prognosticated the enterprise application server's demise.
Discover the world cloud servers market – WhaTech
WhaTech Channel: IT Market Research | Published: 04 May 2017 | Submitted by RNR Market Research, WhaTech Agency News from ReportsnReports - Industry Trends & Forecasts
This market research report offers a complete breakdown of the global Cloud Servers market through exhaustive, industry-authenticated market data, facts, statistics and insights. The market forecasts are backed by a stated set of approaches and assumptions.
The report scrutinizes the market by an exhaustive analysis on market dynamics, market size, current trends, issues, challenges, competition analysis, and companies involved.
The complete report on the Cloud Servers market, spread across 105 pages, profiling eight companies and supported with tables and figures, is now available at www.reportsnreports.com/contacts/938162
The report provides a basic overview of the industry including definitions, classifications, applications and industry chain structure.
Development policies and plans are discussed as well as manufacturing processes and cost structures are also analyzed. This report also states import/export consumption, cost, price, revenue and gross margins.
Key Manufacturers Analysis of Cloud Servers Market: Dell, HP, IBM, Oracle, Cisco, Fujitsu, Hitachi and NEC.
With tables and figures the report provides key statistics on the state of the industry and is a valuable source of guidance and direction for companies and individuals interested in the market.
Report: www.reportsnreports.com/938162
The report includes analysis and investment information across different countries and regions, along with specific market trends. New project investment feasibility analysis, new project SWOT analysis and contact information for industry chain suppliers are provided for clients' assistance.
The market analysis in the global Cloud Servers report is tailor-made to find evolving trends and areas with high growth potential within the industry.
...
Keys to the Kingdom – Identity Week (blog)
Guest Post by Richard Pettit, Developer, Lieberman Software Corporation

With the proliferation of Linux servers in the cloud comes the equally fast spread of SSH for connection to these cloud servers. SSH is not just a Secure SHell for connecting over the network. It is also a key-and-lock system for connecting to servers without the legacy login/password credentials that Linux and Unix users have used for years. And in many cloud environments, it is the only way to connect to these servers.
These keys and locks are the private and public keys that SSH uses for credentials. The private key is the key to the lock that is the public key. The public key can be derived from the private key. But the private key cannot be derived from the public key. The public key can be distributed openly. But the private key must remain closely held since it is the SSH equivalent of the password.
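That one-way relationship between the two keys can be illustrated with textbook RSA and deliberately tiny numbers. This is purely conceptual -- real SSH keys use 2048-bit-plus moduli or elliptic curves -- but it shows why the public half falls straight out of the private material while the reverse direction requires factoring.

```python
# Toy RSA with tiny primes, for illustration only.
p, q = 61, 53                       # private: the secret primes
n = p * q                           # 3233, shared by both keys
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (needs Python 3.8+)

# Deriving the public key (n, e) from the private material is trivial:
public_key = (n, e)

# Going the other way means recovering p and q from n, i.e. factoring,
# which is infeasible at real key sizes. Round-trip sanity check:
message = 42
ciphertext = pow(message, e, n)
assert pow(ciphertext, d, n) == message
print(public_key)  # -> (3233, 17)
```

The same asymmetry is why `ssh-keygen` can regenerate a lost `.pub` file from a private key, but a leaked public key tells an attacker nothing useful.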
With users in possession of these private keys, which like passwords are not something you share with others, it is important to secure them to prevent access by attackers and other threats. It is also important to be able to rotate these keys, i.e. generate new keys to replace the old keys, especially for the privileged identities on these servers.
Whether key rotation is done on a periodic basis in line with policy from the CISO or in the event of a breach, having a system in place that will perform the task on a schedule or that can be used to react quickly to secure the guest is paramount. And, having this key rotation technology as part of the existing Privileged Identity Management (PIM) system makes the task of managing those identities all that simpler.
The crux of public / private key credentials is that the server has the public key and the client brings the private key to demonstrate that it is the legitimate privileged identity. The server can possess the public key. The private key can be held securely by the client and only taken out when connecting to the server.
It is common that the private keys are also stored on the server. But in the case of privileged identities, it is a security issue if a hacker can gain access to the private key. A PIM solution that stores the private key securely and only uses it when connecting to the guest adds another layer of security. And it removes an attack vector by eliminating the private key from the server.
As cryptography evolves, so do the cryptographic algorithms. Managing keys that use algorithms that have been defeated and sent out to the security pasture by the NSA or NIST is an important part of the PIM solution. Such keys must be upgraded either with keys of the same algorithm but a larger bit length (I'll skip the terminology), or with keys of a newer, more secure algorithm.
Identification of these old, insecure keys is an important part of a PIM solution.
Management of keys is not just a matter of rotating an existing key. It also includes upgrading keys to newer, more secure algorithms, discovering keys on servers, identifying insecure keys, retiring keys, creating new keys and propagating them to new servers. These capabilities and more are necessary to maintain proper ongoing security of Linux-based cloud servers. And that keeps the keys to the IT kingdom safe.
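The most basic of those operations -- rotating a key on a server -- boils down to swapping one public key entry for another in a file such as `authorized_keys`. Here is a minimal sketch with made-up key strings; a real PIM tool would also push the change over SSH, verify the new key works before removing the old one, and store the new private key in its vault.

```python
def rotate_key(authorized_keys_text, old_pubkey, new_pubkey):
    """Return the file contents with old_pubkey replaced by new_pubkey."""
    lines = authorized_keys_text.splitlines()
    if old_pubkey not in (line.strip() for line in lines):
        raise ValueError("old key not present; refusing to rotate blindly")
    rotated = [new_pubkey if line.strip() == old_pubkey else line
               for line in lines]
    return "\n".join(rotated) + "\n"

# Illustrative, truncated key material -- not real keys.
before = "ssh-ed25519 AAAAkeyOLD admin@example\n"
after = rotate_key(before,
                   "ssh-ed25519 AAAAkeyOLD admin@example",
                   "ssh-ed25519 AAAAkeyNEW admin@example")
print("AAAAkeyNEW" in after)  # -> True
```

Refusing to rotate when the old key is absent matters in practice: it catches drift between what the PIM system believes is deployed and what is actually on the server.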
If you like this topic, please leave a comment below. You can also follow us on Twitter or subscribe to our RSS feed.
Understanding the Cloud and making it work – WPBF West Palm Beach
WEST PALM BEACH, Fla.
You've heard of the cloud; you might even use it. But do you understand it?
"I think it's storage?" said Phoebe Reckseit. "And it's up there?"
Actually, the cloud isn't a cloud, or even in the sky. In the simplest terms, it means storing and accessing data and programs over the internet instead of on your computer's hard drive.
"The cloud is essentially a backup of all your data," said Josh Barnes, general manager of Experimac.
The cloud is a series of servers based around the world, that you send your data to, and those servers send it to other giant, heavily guarded servers, which keep the data securely stored for you.
But many think it sounds sketchy.
Barnes said a majority of his customers dont understand the cloud.
"I don't know where it goes, where it's stored, so I don't want to use it," said Barnes.
And you may wonder why so many companies are pushing cloud computing.
For one thing, it's a huge business: companies like Google, Apple and Microsoft are making hundreds of millions of dollars on their cloud platforms.
Barnes said another reason is it saves you, their customer, the agony of losing everything when your phone or computer crashes, breaks, or is lost.
"Especially pictures. If you lose them it's devastating, and most people don't back up their stuff," said Barnes.
But if your data is stored in the cloud, it can be retrieved. Everything from your contacts, your documents, to your prized photos, can all be restored.
Still, some are confused about how to actually use the cloud.
Barnes said it actually is fairly simple, but most people need someone to walk them through the setup.
Basically, you go to settings on your computer or device and click on the iCloud or Cloud icon.
With Apple devices, or on iTunes, you sign on with your Apple ID email, and then create a cloud password.
It's important that you use that same password for every device you want to sync with photos and information.
Then you turn on all the items you want to sync and save to the cloud, such as contacts, notes, photos and the like.
You'll probably want to upgrade your cloud storage, too. Apple gives you five free gigabytes, but most people will need 50 gigs to both back up their phones and store their photos.
Apple charges 99 cents a month for 50 gigs, and pricing goes up from there for larger amounts.
You need to turn on all the same settings on all your devices.
Once you're set up, all your pictures and other items should sync automatically with each device.
One warning: even though Apple says it's storing your data, if you delete something on one device, such as the pictures on your phone, they do not remain in the cloud; they are deleted everywhere.
So if you run out of storage on your phone, you will still need to save photos and other items on either a hard drive or a dedicated cloud storage service.
Some, like Flickr, are free for basic storage. Others charge from $5 to $10 a month on average, depending on how much you need to store.
To find your contacts, mail and other information, log into icloud.com on the web.
Music remains on iTunes.
Android phones have similar set-ups.
There still remains the question, though: is your stuff safe?
Experts say that because your data is encrypted when you send it to the cloud, and is then stored in guarded warehouses in remote locations, it's probably safe from hackers, or at least safer than on your home computer.
But the government could legally ask to see your data once it's in the cloud, and it's up to the cloud providers to say yes or no.
Still, many say the benefits of up to unlimited storage, easy backups and syncing information outweigh the concerns.
"Yes, absolutely. I use it myself, and I wouldn't want anyone getting into my stuff, and I use it for everything," said Barnes.
You can find many online comparisons of cloud storage outlining the benefits and costs of each.
And if you still don't quite want to send your information up there, you can rely on old-fashioned external hard drives and back up all your devices to those.
Want to Speed Up Your Website? Try These 5 Cloud Server Applications – Entrepreneur
If you're looking for a way to boost your website's page speed and stop losing potential conversions, you should consider a cloud server.
Cloud servers can help your site load faster by using high-performing servers, lightning-fast processors and high-quality solid-state drive (SSD) storage. By going with a cloud server application, you can boost your site's SEO and customer satisfaction while letting the servers do all the work for you.
Most cloud servers also give you access to customer service resources to solve even your worst mistakes. Using a cloud server allows your hosting to grow with your business and stay secure. Here are five of the best cloud server applications you can use for an improved website.
If you're looking for a cloud server application that can give you high-performance hosting options without the high cost, LCN is the best option for you. LCN even offers free pure SSD storage and unlimited traffic at no extra cost.
LCN is great if you're just looking to try something out. It lets you buy a cloud server risk-free, with a 30-day money-back guarantee and no contract. But with its scalable servers and standard firewall feature, you'll probably end up staying longer.
Server Choice offers its customers cloud hosting, colocation services and PCI compliance to keep your website running smoothly, along with SSD VPS hosting that's powered by the cloud.
If you're wondering what all that gibberish means, it's basically a server equipped with cloud-powered infrastructure and free server management that will speed up your site. It also comes with features like a free cPanel license, CentOS and a LAMP stack. It even allows manual backups.
With Bluehost, you know you're going to be working with a company you can trust. It's been in the cloud hosting business since 2003, working with small business owners and individuals, and boasts a variety of packages with plenty of room to upgrade and expand.
Not only is Bluehost reliable, but it has a full set of security features as well. Bluehost also provides its users with a decent amount of flexibility, and there is no limit on the number of email addresses, MySQL databases, domains and subdomains you can create.
DreamHost boasts a simple, easy-to-use platform that even the most novice of designers won't find intimidating. It's been in business since 1997 and can host everything from small personal blogs to entrepreneur sites.
What makes GreenGeeks special is that it can make your website load incredibly fast while also helping out the environment. Its mission is to provide environmentally friendly cloud hosting services -- it's even a certified EPA Green Power partner.
It's a great choice for those concerned about secure cloud hosting, as it offers excellent security features too, and is available in 150 countries. It also has excellent 24/7 customer service and a 30-day money-back guarantee.
Cloud server applications are an excellent way to improve the speed of your business's website while getting a variety of other features. If you're looking for a hassle-free, budget-friendly cloud hosting service, try one of these five companies for a faster website.
Nathan Resnick is a serial entrepreneur who currently serves as CEO of Sourcify, a marketplace of the world's top manufacturers. Having brought dozens of products to life, he knows the ins and outs of how to turn ideas into realities.
Evolving options for on-premises data centers – GCN.com
Evolving options for on-premises data centers
Ever since cloud computing became a realistic option for large operations, it has dominated the data center conversation. Efforts to consolidate data centers and cut the related operations and maintenance costs have pointed to cloud as a potential option. There are, however, instances when using on-premises computing can be beneficial and necessary. And the terminology in the space has so much overlap that agencies operating an on-prem site often call it a cloud.
So what, exactly, defines a data center? Cameron Chehreh, Dell Federal CTO, said the federal government will often call a couple of servers in a closet a data center, but that skews the more commercial definition: A centralized location that houses an organization's computing, storage and applications. David McClure, chief strategist at Veris Group, said a simple definition is a big building full of servers.
A cloud, meanwhile, might be described as "someone else's servers." There's significantly more to it than that, but it is a remote data center that is usually managed by someone other than the organization or agency storing its data there. This is the space filled by Amazon Web Services, Microsoft, Google and many others. "I think of cloud as locationless computing," McClure said. "It is on demand, as needed and priced according to use."
Blurring the lines further is private cloud, which can mean a few different things. "If you put three cloud specialists in the room, you may get twelve separate answers on what a private cloud is," Chehreh said. Christian Heiter, CTO of engineering at Hitachi Data Systems Federal, agreed that there is no consensus on what exactly private cloud is. "This is where the definition gets a little fuzzier -- it's a term of art," he said.
The glib definition is that private cloud is just a data center with a fancy name to make it sound like whoever is building it is hip to the trends. McClure, however, said that isn't necessarily accurate. For an on-premises data center to truly be considered a private cloud, it must implement modern technologies like virtualization and scalability, he said. A private cloud could also be a remote data center with a single tenant.
The Army, for example, has recently begun working with IBM to build a consolidated private cloud solution at Redstone Arsenal near Huntsville, Ala. IBM is building and will manage the facility, but it will be an on-premises facility devoted to the Army's needs. The first phase of the project aims to put 38 applications into the cloud and meet the Defense Information Systems Agency's Impact Level 5 -- the highest security level for unclassified data.
McClure was working in the federal government during the Obama administration when the first push to cloud began. Federal officials, including then-federal CIO Vivek Kundra, recognized the opportunity to lower operations and maintenance costs by consolidating on-premises data centers and moving some operations to the cloud. Yet a decade later, "we're still in the crawl-walk phase of cloud computing," McClure said.
Efforts are being made to make the switch, though, and Dave Powner, director of information management and technology resources issues at the Government Accountability Office, is a longstanding proponent of that effort. "It is really contingent on how well agencies can optimize what they have," Powner said of the move to cloud.
In Powner's view, however, that optimization has been fairly limited. An Office of Management and Budget memo said the average government server is utilized at just 9-12 percent of its capacity. "That was really the impetus to start this data center consolidation effort," he said. The goal is to get that number up to 65 percent.
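As a rough illustration of what that target implies (my arithmetic, not OMB's, and the fleet size is hypothetical), raising utilization from roughly 10 percent to 65 percent means consolidating about six and a half physical servers onto one:

```python
import math

# Illustrative consolidation arithmetic based on the utilization figures above.
current_utilization = 0.10   # near the midpoint of OMB's 9-12 percent figure
target_utilization = 0.65    # the stated consolidation goal
servers_today = 1000         # hypothetical fleet size

# The total useful work stays constant; fewer, busier servers carry it.
servers_needed = math.ceil(servers_today * current_utilization / target_utilization)
print(servers_needed)  # 154 -- roughly a 6.5x consolidation
```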
A 2014 GAO report on the government's consolidation efforts estimated that the Treasury Department avoided more than $577 million in costs through consolidation between 2011 and 2013. Other agencies have also seen tens of millions in savings, the report said.
"Some agencies have made big strides," he said. The Departments of Agriculture, Treasury and Justice; the General Services Administration; and NASA have all closed 50 percent of their data centers. The Defense Department has not yet hit 50 percent, but it has closed 700 facilities, Powner said.
"I think there is more of an acceptance that you can meet some of the security requirements through cloud offerings," Powner said, attributing that shift to the examples set by early adopters, which have allowed others to see implementations that actually work.
Yet while cloud promises cost savings and flexibility, the future of a government without data centers is nowhere in sight. "I don't necessarily see everything moving to cloud," said Sophia Vargas, an infrastructure analyst at Forrester. "I think it's kind of stuck -- potentially for a long time -- in more of a hybrid, multistate."
That seems to be the consensus. At least for the foreseeable future, hybrid solutions will define the data center, and the inner workings of those data centers are changing accordingly.
How hybrid helps
Vargas' colleague Richard Fichera, Forrester's vice president and principal analyst of infrastructure, said the simple definition of a hybrid data center is exactly what it sounds like: using both enterprise and cloud solutions for data storage. Using a local data center and one or more cloud services can provide a best-of-both-worlds scenario that reduces cost and leads to the consolidation that Powner seeks.
"People are starting to find balance," Chehreh said. "There is nothing but a bright future for hybrid moving forward."
Originally posted here:
Evolving options for on-premises data centers - GCN.com
Core blimey! 10000 per rack in startup’s cloud-in-a-box – The Register
+Comment Say hello to hyperdense server and NVMe storage startup Aparna Systems and its Cloud-in-a-Box system.
Originally named Turbostor and founded in February 2013, the company has emerged from stealth with the Orca Cloud, a 4U enclosure that converges compute, storage and networking, and offers, Aparna claims, up to 10,000 cores in a rack. The 4015 version holds 15 servers and the 4060 up to 60; with 16-core processors, that works out to nearly 1,000 cores per 4U 4060 box, so ten of them in a 42U rack gets you to the claimed figure.
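A quick sanity check of that density claim, using the figures Aparna quotes (up to 60 16-core Oserv16 servers per 4060 box, ten 4U boxes in a 42U rack):

```python
# Sanity-check Aparna's "up to 10,000 cores per rack" claim from the stated figures.
CORES_PER_OSERV16 = 16   # 16-core Broadwell Xeon per server
SERVERS_PER_4060 = 60    # up to 60 servers in a 4U Orca Cloud 4060
BOXES_PER_RACK = 10      # ten 4U enclosures fit a 42U rack

cores_per_box = CORES_PER_OSERV16 * SERVERS_PER_4060
cores_per_rack = cores_per_box * BOXES_PER_RACK

print(cores_per_box)   # 960 -- just under 1,000 cores per 4U box
print(cores_per_rack)  # 9600 -- in line with the "up to 10,000" claim
```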
The Orca Server is packaged in a cartridge sized like a 3.5-inch hard disk drive, draws less than 75 watts, and comes in two variants: the Oserv8 with 8-core Broadwell Xeons, and the Oserv16 with 16-core ones.
Oserv16 Server
Both have DDR DRAM, and dual SATA or NVMe SSDs. We're told "storage IO is non-blocking based on its support for both SATA at 12 Gbps (6 Gbps per SSD) and NVMe at 64 Gbps (32 Gbps per SSD), with latencies of 100 microseconds (μs) and 10 μs, respectively."
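The per-SSD and aggregate figures in that claim are internally consistent, given the dual SSDs per server:

```python
# Check the quoted per-SSD vs aggregate bandwidth figures (dual SSDs per server).
SSDS_PER_SERVER = 2
SATA_PER_SSD_GBPS = 6
NVME_PER_SSD_GBPS = 32

print(SSDS_PER_SERVER * SATA_PER_SSD_GBPS)   # 12 -- matches "SATA at 12 Gbps"
print(SSDS_PER_SERVER * NVME_PER_SSD_GBPS)   # 64 -- matches "NVMe at 64 Gbps"
```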
The Cloud system's enclosure is NEBS-compliant and delivers fully fault-tolerant, non-stop, non-blocking performance.
It can be used as a bare-metal system or for running virtualized or containerized environments.
Orca Cloud 4060
Aparna claims its compact convergence of compute, storage and networking, when compared to existing clusters made from rack-level systems or blade servers, can mean an up to 40 per cent CAPEX and OPEX saving, lower space needs and an electricity draw reduction of up to 80 per cent.
It's marketing the Orca systems to service providers and enterprises for mission-critical applications from the edge to the core. The startup is casting its net far and wide, saying Orca's "open software architecture and non-stop high performance make the Cloud-in-a-Box suitable for virtually any networking, computing or storage application, including fog and multi-access edge computing, databases, data analytics, the Internet of Things, artificial intelligence and machine learning."
CEO Sam Mathan says he expects "the Cloud systems [to be] especially popular for edge computing and aggregation applications. Of course, these same capabilities are also important in the enterprise, where the ability to scale compute and storage resources is often constrained by available data centre space and power."
Other execs include CTO Alex Henderson, who co-founded Turbostor in 2013, as did Ramana Vakkalagadda, director of software engineering.
Aparna's investors include Divergent, among others. We don't know the funding amounts, but Divergent reportedly put in $500,000.
[Table: Orca systems and servers]
This hopeful killer whale of servers seems to rely on innovative packaging to produce its core density, high per-server bandwidth and precise event timing. It must also have ferociously efficient cooling technology in its 4060 version, with 60 16-core Xeons milling around in there.
We don't know the storage capacity, but suspect it's not much given the space constraints inside an Orca Server; Aparna simply does not say how much SSD capacity there is, nor whether 2.5-inch or M.2 form factor flash is used.
Nor do we know much about its sales channel. The company has just come out of stealth, has a few customers and what looks like a hot box with neat timing features suitable for carrier types as well as dense server packaging.
We think the funding needs for such a hardware-focussed startup must be in the $5m to $10m range, if not more. Aparna now has to sell a fair number of systems and get set for an A-round of VC funding. It has a mountain to climb as it establishes itself, with Cisco, Dell, HPE, Huawei, Lenovo, Supermicro and others selling servers against it. If Aparna can pack server CPUs this close together then, surely, so can they.
Orca Cloud 4015 and 4060 systems, and the Oserv8 and Oserv16 Servers are all available for customer shipment. Pricing for entry-level configurations of the Cloud model 4015 system begins at $49,500.
Read more here:
Core blimey! 10000 per rack in startup's cloud-in-a-box - The Register
Data center market suffers as more organizations opt for the cloud – Health Data Management
Cloud computing continues to take a bite out of the data center market, as new projections from Gartner Inc. show barely any spending growth for the year.
Worldwide IT spending on the data center system segment is expected to grow by only 0.3 percent this year, Gartner says. Despite that low number, it is better news than last year, which actually saw negative growth.
Also See: FBI warns healthcare organizations to check FTP servers
"We are seeing a shift in who is buying servers and who they are buying them from," explained John-David Lovelock, research vice president at Gartner. "Enterprises are moving away from buying servers from the traditional vendors and instead renting server power in the cloud from companies such as Amazon, Google and Microsoft. This has created a reduction in spending on servers, which is impacting the overall data center system segment."
Of the five spending areas studied by Gartner, only communication services had a lower projection for 2017, at negative 0.3 percent growth. Spending on devices is projected to see a 1.7 percent increase and enterprise software is expected to lead at a 5.5 percent increase.
Worldwide IT spending is projected to total $3.5 trillion in 2017, a 1.4 percent increase from 2016, according to Gartner. Heading into 2017, Gartner had originally projected a 2.7 percent increase, but adjusted its projection down due to the rising value of the U.S. dollar against foreign currencies.
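For a sense of scale (my arithmetic, not Gartner's), the downgrade from 2.7 to 1.4 percent growth on a base that large shaves tens of billions of dollars off the forecast:

```python
# Rough dollar impact of Gartner's revised growth projection.
spending_2017 = 3.5e12    # projected worldwide IT spend for 2017, USD
original_growth = 0.027
revised_growth = 0.014

# Growth is measured off the 2016 base, so back that out first.
base_2016 = spending_2017 / (1 + revised_growth)
difference = base_2016 * (original_growth - revised_growth)
print(round(difference / 1e9, 1))  # ~44.9 billion USD trimmed from the forecast
```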
David Weldon is the editor-in-chief of Information Management.
The rest is here:
Data center market suffers as more organizations opt for the cloud - Health Data Management
How hybrid cloud is strengthening Fitness First – ZDNet
Hybrid cloud has helped Fitness First drive development efficiency.
There's never been a better time to start a business: whereas in the past you'd have needed to build your own data centre and fill it with IT infrastructure, now major cloud service providers like Amazon, Microsoft and Google can provide you with all the services you need to run an online business. To take just one example, Airbnb runs many of its services on AWS.
But what about more established enterprises? It's not viable for them to suddenly pack up all their data centres and move everything into the public cloud. But there is a happy medium: hybrid cloud. This approach enables organisations to take advantage of cloud services while also harnessing the power of on-premises platforms in a way that provides enough flexibility to deal with any sudden demands.
One organisation that has opted for this hybrid approach in an effort to modernise and improve their IT strategy is Fitness First, the gym and health club operator. Founded in the UK in 1993, Fitness First has grown to become one of the largest fitness brands in the world, with over one million members across 370 clubs in 16 countries.
But in the 24 years since Fitness First began operating, technology has changed dramatically, and so have the needs of the business. That's why the company decided to examine its infrastructure and how it was used -- especially as the cost of physical servers escalated as the firm expanded.
"The issue the organisation faced is that it has historically gone down a physical server, physical data centre setup -- a lot of investment over the years into tin. What it ended up with was a lot of hardware which was now getting out of date," explains Jon Forster, Consulting Senior IT Advisor at Moray Limited, the holding company that owns the Fitness First Group.
The company was beginning to struggle with the flexibility required to make changes while also managing costs. "We wanted to change to IT being more of an enabler rather than a reason for things to be slow. We'd really hit a roadblock," says Forster.
It's for that reason that Fitness First looked towards hybrid cloud, to provide "that flexibility to increase or decrease the computing power you need at that moment in time," he says.
For example, tasks such as application development or work around the website and coding only need to be powered at very specific times; the rest of the time the servers dedicated to these tasks are doing nothing. "They have no value until you want them back," says Forster.
So Fitness First began looking for a hybrid cloud provider that did all this and also tied into its Microsoft Azure-based infrastructure. One of the key demands of the new service, explains Forster, was the ability to be "flexible within your own environment without additional tin." Discussions with colleagues in the industry led Forster to enterprise cloud provider Nutanix.
"They deliver what we want; it's tied into Microsoft and I can have no real gap between where we host things and there's no need for multiple technologies," Forster says.
So Fitness First opted to work with Nutanix and set up a hybrid cloud server in a matter of weeks. Forster was impressed enough to expand the relationship after just a few months.
"They worked with some of our guys and set it up very quickly. In fact, within about 10 weeks of buying the first three blocks, it was going so well I bought another one in order to put everything on Nutanix," he says.
By shifting towards a partially cloud-based model, Fitness First has "completely changed" the way it does back end development, says Forster. "It suddenly becomes really quick. It changes so many ways -- having to look up backups has just gone away as you push it all up into Azure and it just tells you where and how it worked".
The ultimate benefit to Fitness First is that it's now able to spin services up or down in reaction to increases in demand -- be that on a day-to-day basis or at the times of year when there's a surge in new members.
"The business runs in peaks and troughs. But now it's got enough compute power which you can wind back whenever you want, so when we have those peaks, it's absolutely fine, it doesn't slow down," says Forster.
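The peaks-and-troughs economics Forster describes can be sketched with a toy model; every number here is hypothetical and illustrative, not Fitness First's actual pricing or capacity:

```python
# Hypothetical illustration of the peaks-and-troughs point: pay for capacity
# only when demand needs it, instead of provisioning for the annual peak.
HOURLY_RATE = 0.50       # assumed cost per server-hour (illustrative)
PEAK_SERVERS = 20        # capacity needed during sign-up surges
BASELINE_SERVERS = 5     # capacity needed the rest of the year
PEAK_HOURS = 2 * 730     # roughly two months of peak demand
TOTAL_HOURS = 8760       # hours in a year

# Fixed on-premises model: size for the peak, pay for it all year.
fixed_cost = PEAK_SERVERS * TOTAL_HOURS * HOURLY_RATE

# Elastic model: run the baseline all year, add servers only for the peaks.
elastic_cost = (BASELINE_SERVERS * TOTAL_HOURS
                + (PEAK_SERVERS - BASELINE_SERVERS) * PEAK_HOURS) * HOURLY_RATE

print(round(fixed_cost))    # 87600
print(round(elastic_cost))  # 32850 -- well under half the fixed-capacity cost
```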
The new setup also enables Fitness First to build new features or alter existing ones much more quickly than was previously possible, he explains.
"It's now quick so it's not a case of asking for something and getting it 12 months later; it's much quicker. You can give anyone the ability to get on and do it straight away and not be held back. That's helped everybody".
Naturally, switching to a cloud-based model has also saved Fitness First costs in terms of running and operating physical data centres.
"It's much cheaper than we had before. Before we had about 18 racks of kit and now we have two half-racks -- that's it. In terms of cost of space, of power, that's totally collapsed," Forster says, adding that those funds can be used to improve customer service and feed into the bottom line.
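Taking Forster's rack figures at face value, the footprint reduction is dramatic:

```python
# Rack-space reduction implied by Forster's figures.
racks_before = 18
racks_after = 2 * 0.5   # two half-racks = one rack-equivalent

reduction = 1 - racks_after / racks_before
print(round(reduction * 100))  # 94 -- roughly 94 percent less rack space
```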
So what's the main thing organisations looking to invest in hybrid cloud infrastructure should be doing? It's important not to be nostalgic about your old physical data centres, and to look to move forward, according to Forster.
"Take a step away from what you have and don't base everything on that -- base it on the need for the future. Don't just keep slowly changing what you have, or it'll end up costing you a lot more money," says Forster.
Read the rest here:
How hybrid cloud is strengthening Fitness First - ZDNet