Category Archives: Cloud Servers
Cohesity branches out data management software to ROBO and the edge – Blocks and Files
Cohesity has upgraded its data management software to cover remote office/branch office (ROBO) and edge IT sites, bringing them into a single environment along with central data centres.
Cohesity's hyperconverged secondary storage platform offers backup; golden master file copies for test and dev, compliance, and other data users; archiving; tiering to the cloud; and general file and object storage.
Its market sweet spot to date has been hybrid, covering business data centres and the public cloud. Now it is extending from the data centre base to branch offices and data-generating IoT edge locations. The software runs as a virtual appliance or on certified HPE and Cisco servers.
Vineet Abraham, Cohesity SVP for engineering and product management, said in a statement: "By offering the enterprise-class features of Cohesity software in a cost-effective, plug-and-play solution, we are empowering organisations to bring their data centre, cloud, and edge together on a single platform."
Cohesity ROBO and edge users get instant mass restores to recover an entire branch when needed. Cohesity's dedupe and compression reduce bandwidth utilisation when sending data to, or receiving it from, central or cloud sites. Cohesity also suggests its file and object services can replace local Windows file servers.
For Cohesity the edge includes branches of national banks, retailers, chain restaurants, rental car agencies, warehousing and distribution, pharmaceutical, and global IT services. Data from these sites typically needs to be managed, protected, and secured locally, without dedicated IT staff to handle on-site administration.
Cohesity cites IDC research that shows more than half of enterprise data will be created and processed outside the data centre or cloud by 2022. According to the company, this underscores the need for a unified approach covering ROBO, the edge, enterprise data centres and public cloud.
Cohesitys ROBO offering will be generally available to customers from Cisco and HPE by Spring 2020.
Data Center REITs: Battle Of The Clouds – Seeking Alpha
REIT Rankings: Data Centers
In our REIT Rankings series, we introduce and update readers to each of the residential and commercial real estate sectors. We focus on sector-level fundamentals, analyzing supply and demand conditions and macroeconomic factors driving underlying performance. We update these reports quarterly with a breakdown and analysis of the most recent earnings results.
(Hoya Capital, Co-Produced with Brad Thomas through iREIT on Alpha)
Data Center REITs are the home of the "cloud," the physical epicenter of the internet. Within the Hoya Capital Data Center Index, we track the five largest data center REITs, which account for nearly $100 billion in market value: Equinix (EQIX), Digital Realty (DLR.PK), CyrusOne (CONE), CoreSite (COR), and QTS Realty (QTS). While not included in the index, business storage operator Iron Mountain (IRM) also operates a relatively small portfolio of data centers, as does non-REIT Switch Inc (SWCH).
One of the newer REIT sectors, Data Center REITs have been perennial outperformers since bursting onto the scene in the middle of the last decade, serving as one of the primary growth drivers of the commercial real estate sector. Data Center REITs comprise 4-12% of the broad-based Core REIT ETFs, but roughly a third of the Benchmark Data & Infrastructure Real Estate ETF (SRVR), which also includes cell tower and billboard REITs. SRVR was one of the best-performing real estate ETFs in 2019 - despite its relatively low dividend yield of around 1.5% - driven by the 44% average total returns from these data center REITs, outpacing the 28% total return Core REIT ETF average.
As is common across most real estate sectors, the companies providing the behind-the-scenes infrastructure and real estate are not "household names" compared to their more consumer-facing tenants. The companies synonymous with cloud computing, Amazon (AMZN), Microsoft (MSFT), Google (GOOG), Alibaba (BABA), and IBM (IBM), are among the largest and most important tenants of these data center operators, and have become even more critical tenants in recent years as a growing share of leasing activity has accrued to a smaller handful of tenants. Amazon owns roughly a 32% market share according to Canalys Research, outpacing the 18% share from Microsoft, 6% share from Google, and 5% share from Alibaba.
External growth has been the modus operandi for these companies as Data Center REITs have been relentless developers and acquirers over the last half-decade. Consolidation remains a continuing theme in the data center sector as these data center operators attempt to fend off mounting competitive pressures from their ever-powerful tenants. Data Center REITs own roughly 30% of investment-grade data center facilities in the US and command roughly a fifth of data center capacity globally. Outside of these five REITs, other companies operating in the space include a mix of international, private, and "c-corp" entities, including Zayo Group (ZAYO), Switch (SWCH), Flexential, Cyxtera, TierPoint, and Cologix.
Data center REITs operate in three primary lines of business: wholesale, colocation, and interconnection. The value of each data center is largely a function of its position along the internet backbone, the physical fiber-optic network that links every connected device across the world. Properties within the backbone, or more precisely at the "intersection" of various networks, are able to provide higher-value network-based colocation and interconnection services, which command higher rent-per-MW and generally have significantly higher barriers to entry due to the inherent "network effects." Properties on the periphery, or those lacking a critical mass of interconnection tenants, typically provide more ubiquitous enterprise-based wholesale services, including storage and cloud-based software applications, and primarily rent these facilities to wholesale customers who pay lower per-SF rent.
The competitive landscape, particularly in the lower-barrier wholesale data center market, is shifting as these hyperscale providers are responsible for a steadily growing share of total leasing activity. These "public cloud" providers continue to build out their enterprise software suite, allowing enterprises to ditch their independent managed servers within the data center facilities and instead operate exclusively through a public cloud. Effectively, more customers are able to forego any direct relationship with these data center operators and contract instead with one of the handful of "public cloud" operators. Digital Realty expects half of all data center servers to be operated by just a half-dozen hyperscale tenants by 2021, up from 25% in 2018.
Responding to the mounting competitive threats posed by "hyperscale" giants - Amazon, Microsoft, and Google - data center operators have turned to M&A to regain some degree of pricing power, with a particular focus on the higher-value interconnection-focused facilities. Digital Realty significantly expanded its interconnection and colocation business through its Interxion acquisition but remains a mostly wholesale-focused entity with roughly two-thirds of revenues coming from that lower-barrier business line. Interconnection, which relies on "network effects," can translate into a competitive advantage owned by REITs that hyperscalers have more difficulty replicating. Equinix has the highest "quality" portfolio of network-dense assets followed by the smaller CoreSite. CyrusOne, QTS, and the majority of non-REIT data center operators focus primarily on more competitive wholesale assets.
Typically housed in windowless industrial-style buildings surrounded by massive generators and cooling equipment, data centers provide the critical infrastructure - power, cooling, and physical rack space - to a variety of enterprise customers with different networking and computing needs, who generally install and manage their own server and computing equipment in the facilities. Housing millions of terabytes of mission-critical data for thousands of individual customers, physical data security and operational reliability are crucial attributes of data center facilities. As noted above, data center REITs have been among the fastest-growing REIT sectors, but are also some of the most "expensive" and the lowest-yielding companies across the REIT space.
As discussed in our REIT Decade in Review, at the real estate sector-level, three themes dominated the 2010s: 1) The Housing Shortage, 2) The Retail Apocalypse, and 3) The Internet Revolution. Producing an annualized rate of return that was more than double that of the broader REIT average from 2015-2019, data center REITs have ridden the thematic growth trends associated with the boom in outsourced IT spending, and have been the third-best performing REIT sector over the past five years, trailing only manufactured housing and cell tower REIT sectors.
Heading into 2019, data center REITs were coming off an uncharacteristically weak year - dipping 14% in 2018 - but produced total returns of 44% in 2019. 2020 has picked up nearly exactly where 2019 left off, with the "eREIT" sectors (Data Center, Cell Tower, Industrials) leading the charge yet again. Gaining 10% so far this year, the Hoya Capital Data Center Index has outpaced the 6.5% gains on the broader REIT average. This outperformance comes despite signs of slowing growth in global IT spending and choppy leasing activity from the hyperscale providers that we'll discuss in more detail below.
Network-dense portfolios produced superior performance in 2019, led by Equinix, which surged nearly 70%. Wholesale-heavy REITs including longtime sector stalwart Digital Realty were laggards last year as competition and pricing pressure from hyperscalers remain intense, while Digital Realty was also pressured by a lukewarm reaction to its plans to acquire InterXion - particularly from INXN's shareholders. QTS has been the leader out of the gates so far in 2020 following solid earnings results, while CyrusOne has lagged despite a nearly 8% pop last week after the firm reportedly retained Morgan Stanley after receiving M&A interest. In January, CyrusOne announced plans to cut 12% of its workforce, citing "continued moderation in demand from hyperscale customers."
For all the focus on M&A possibilities, the performance of the data center REIT sector continues to be at the mercy of the quarterly net leasing activity figures - the most closely-watched metric for the sector. Data center leasing activity surged in the first half of 2018, but dipped sharply into the end of the year, dragging with it the stock prices of these REITs. While still choppy, leasing bounced back nicely in 2019, as have the REIT stock prices, and this past quarter's results were generally in line with or slightly better than estimates. The $116 million in net incremental annualized revenues was a 45% jump from 4Q18, bringing the full-year leasing activity among these four REITs to $495 million, down only slightly from the $512 million in 2018.
Solid leasing data was a welcome relief given the continued uncertainty over global IT spending - particularly in the European and Asian markets - and over domestic hyperscale demand. In its most recent forecast in January, Gartner revised its 2019 estimates and its 2020 Worldwide IT Spending forecast lower. Gartner tracked just a 0.5% rise in overall global IT spending, the slowest rate of growth since the recession, but sees a pickup in 2020 to 3.4% growth. The Data Center and Enterprise categories, reflecting the critical drivers of data center spending, are expected to reaccelerate slightly next year after significantly slowing growth in 2019.
Digital Realty and QTS were the relative standouts in 4Q19 with solid beats on leasing figures, combining for $97 million of the $116 million total. CyrusOne and CoreSite, however, fell shy of leasing estimates. From an AFFO standpoint, Digital Realty, Equinix, and CyrusOne all topped prior guidance in the fourth quarter while QTS fell just shy, but the story has been the continued deceleration in AFFO growth - particularly among the wholesale-focused REITs - since the middle of the decade. Escaping the wholesale hyperscale weakness has been Equinix, which led the way with 10% AFFO per share growth in 2019 and expects to see 8% growth in 2020.
While much of the investment community remains hyper-focused on leasing metrics, which we see as volatile and prone to false signals, we remain focused on re-leasing spreads as the key forward-looking indicator of underlying pricing power and on supply/demand conditions as an indicator of any emerging barriers to entry, which we have not yet seen to any significant degree. Digital Realty, which we view as the industry bellwether, reported a 0.6% decline in cash renewal spreads, falling back into negative territory after recording its strongest reading since 4Q15 last quarter. DLR reported a 4% decline in "same capital" NOI growth in 2019, reflecting the continued (and perhaps underappreciated) competitive challenges facing the data center sector, particularly the wholesale/hyperscale business lines.
Overall, while industry average revenues and EBITDA grew roughly 10% in 2019, AFFO per share rose at a more modest 5.5% in 2019, roughly consistent with last year's 5.3% achieved AFFO per share growth rate. 2020 guidance was generally in line with analyst estimates but calls for a continued deceleration in AFFO per share growth to an average of 3.2%. (DLR plans to provide 2020 guidance after the InterXion acquisition.) The interconnection-focused Equinix continues to be the relative standout, while the smaller wholesale-focused REITs continue to see decelerating growth. We expect the trend of interconnection outperformance to continue for the foreseeable future, but will be interested to see whether consolidation will begin to stabilize the downward pressure on same-store pricing on wholesale leasing.
Size and scale have proven to be competitive advantages in the data center space, and these REITs have used acquisitions as a means to stay in front of competitive threats from hyperscale providers. Digital Realty shook the data center landscape last October with its announced $8 billion acquisition of European data center giant Interxion (INXN), the eighth largest operator in the world. The fourth major acquisition for DLR since 2015, the firm acquired Telx in 2015, fellow REIT DuPont Fabros in 2017, and Ascenty in 2018. The combined entity will own more than 275 data centers and earn close to $4 billion in annual revenues in 2019. While not reflected yet in the chart below, the $8.4 billion deal will be the largest data center transaction ever, topping the $7.6 billion DFT deal in 2017.
While weak pricing power is nothing new for data center REITs, it becomes more of a concern as external growth rates begin to naturally cool across the sector following several years of above-trend growth. The development pipeline has come back down to Earth over the last few quarters after briefly exceeding $3 billion at the end of 3Q18 and finishing 2018 at $2.9B. While development remains fairly disciplined and responsive to demand, it's unclear whether there are any real barriers to supply growth in the wholesale segment, a segment flush with cash from the hyperscale giants.
As they have for most of the past half-decade, data center REITs continue to trade at premium valuations to the REIT averages based on Free Cash Flow (aka AFFO, FAD, CAD) based metrics. Powered by the iREIT Terminal, we note that data center REITs have seen some of the fastest rates of FFO growth over the last five years at roughly 8%, but we expect growth to be closer to the REIT average over the next half-decade. As noted above, data center REITs trade at an estimated 20-30% premium to NAV. Maintaining this NAV premium is critical to accretively funding these REITs' external growth ambitions and maintaining a critical cost of capital advantage over private market competitors.
Data Center REITs pay an average dividend yield of 2.3%, which is below the REIT sector average dividend yield of around 3.4%. Data center REITs pay out just 50% of their free cash flow, leaving them ample capacity to increase dividends or reinvest in growth. In our recent report, "The REIT Paradox: Cheap REITs Stay Cheap", we discussed our study that showed that lower-yielding REITs in faster-growing property sectors with lower leverage profiles have historically produced better total returns, on average, than their higher-yielding counterparts.
Within the sector, we note the differences in yield for these five REITs and an estimation of their approximate payout ratios. CoreSite yields a sector-high of 4.2% followed by Digital Realty at 3.3% and CyrusOne at 3.0%. Equinix remains the most "growth-oriented" REIT, paying a yield of just 1.6% but retaining more than 60% of free cash flow.
Business spending on cloud infrastructure is still in its infancy, as nearly 75% of global IT spending is still on traditional IT. According to IDC, cloud deployment is expected to steadily accelerate over the next decade, and by 2020, more than 50% of IT spending will be on cloud-based infrastructure. The economics of cloud deployments are expected to remain highly favorable for the foreseeable future. While our base-case is that data center pricing remains soft due to intense competition from hyperscale providers, Data Center REITs may be able to retain pricing power through consolidation if or when barriers to supply growth develop. We believe that scale is a competitive advantage that may lead to accretive acquisition-fueled growth. Below, we outline five reasons that investors are bullish on the data center space.
Flush with cash, "big-tech" has invested enormously over the last five years in enterprise cloud services and building out network capacity, primarily by leasing massive quantities of space from these data center REITs. While certainly a short-term win for these REITs, these "public cloud" offerings are increasingly winning business from larger corporate customers that may have historically deployed a more traditional hybrid cloud solution that involved these clients renting space directly from these data center REITs. While Digital Realty only projects out to 2021, we see hyperscale players commanding a growing share of total data center traffic and processing power, and expect more firms to work more exclusively in the "public cloud." We see the industry evolving into a model more akin to the cell tower REIT sector, whereby a small number of "carriers" have an effective duopoly or triopoly due to the cost advantages of scale and network effects.
Data Center REITs - the physical epicenter of the "cloud" - continue to ride the substantial secular tailwinds behind the "big data" and cloud computing boom, surging nearly 50% in 2019 and are off to another hot start in 2020. Storm clouds have been building around the high-flying technology-focused sector, however, as intense competition and furious supply growth have weakened pricing power, underscored by Digital Realty's 4% decline in "same capital" NOI growth in 2019.
Responding to pressure from the hyperscale giants - Amazon, Microsoft, and Google - data center operators have turned to M&A to regain pricing power, a trend we expect to continue. As we told National Real Estate Investor in January, "Given the favorable cost of capital enjoyed by these REITs, we see more consolidation likely in the early 2020s, and would be shocked if there were still five REITs in their current form by 2025." Given the recent "merger mania" trends in the REIT sector over the past month, we'd say that at least one acquisition is likely by the end of 2020.
While we expect continued robust demand for data center space, the outlook for the REITs themselves remains cloudy given the increasingly competitive landscape. With negative same-store growth, these companies are highly dependent on external growth, underscored by the moderation in AFFO growth in 2020 to levels below the broader REIT average. That said, we believe that incremental demand associated with the "next generation" of cloud computing - applications like artificial intelligence, the internet of things, and augmented reality - could shake up the competitive dynamics in a way that could ultimately benefit these REITs, and will be the "wild-card" that will determine if the 2020s are as strong as the 2010s for the data center REIT sector.
If you enjoyed this report, be sure to "Follow" our page to stay up to date on the latest developments in the housing and commercial real estate sectors. For an in-depth analysis of all real estate sectors, be sure to check out all of our quarterly reports: Apartments, Homebuilders, Student Housing, Single-Family Rentals, Manufactured Housing, Cell Towers, Healthcare, Industrial, Data Center, Malls, Net Lease, Shopping Centers, Hotels, Billboards, Office, Storage, Timber, and Real Estate Crowdfunding.
Hoya Capital is excited to announce that we've teamed up with iREIT to cultivate the premier institutional-quality real estate research service on Seeking Alpha! Sign up for the 2-week free trial today!
Disclosure: I am/we are long DLR, COR. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.
Additional disclosure: Hoya Capital Real Estate advises an ETF. In addition to the long positions listed below, Hoya Capital is long all components in the Hoya Capital Housing 100 Index. It is not possible to invest directly in an index. Real Estate and Housing Index definitions and holdings are available at HoyaCapital.com.
Index performance cited in this commentary does not reflect the performance of any fund or other account managed or serviced by Hoya Capital Real Estate. All commentary published by Hoya Capital Real Estate is available free of charge and is for informational purposes only and is not intended as investment advice. Data quoted represents past performance, which is no guarantee of future results. Information presented is believed to be factual and up-to-date, but we do not guarantee its accuracy. Real Estate and Housing Index definitions and holdings are available at HoyaCapital.com.
High-risk vulnerabilities and public cloud-based attacks on the rise – Help Net Security
A sharp increase (57%) in high-risk vulnerabilities drove the threat index score up 8% from December 2019 to January 2020, according to the Imperva Cyber Threat Index.
Following the release of Oracle's Critical Patch Update - which included 19 MySQL vulnerabilities - there was an unusual increase in the vulnerabilities risk component within the Index.
Specifically, there was a 57% increase in vulnerabilities that can be accessed remotely with no authentication required, have a public exploit available, or are trending in social media, meaning they pose an especially high level of risk to businesses.
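The "high-risk" classification above is essentially a logical OR over three criteria. As a rough sketch of that rule, the snippet below flags a vulnerability if any one criterion applies; the field names (`remote_no_auth`, `public_exploit`, `trending_social`) are illustrative assumptions, not Imperva's actual schema.

```python
# Hypothetical sketch of the Index's "high-risk" rule: a vulnerability is
# high-risk if ANY of the three criteria described above applies.
# Field names are assumptions for illustration only.

def is_high_risk(vuln: dict) -> bool:
    """Return True if any high-risk criterion applies to the vulnerability."""
    return (
        vuln.get("remote_no_auth", False)     # remotely accessible, no auth
        or vuln.get("public_exploit", False)  # public exploit available
        or vuln.get("trending_social", False) # trending on social media
    )

vulns = [
    {"id": "CVE-A", "remote_no_auth": True},
    {"id": "CVE-B", "public_exploit": True},
    {"id": "CVE-C"},  # none of the criteria apply
]
high_risk = [v["id"] for v in vulns if is_high_risk(v)]
print(high_risk)  # ['CVE-A', 'CVE-B']
```

A real scoring system would weight these signals rather than treat them as a simple boolean, but the OR form matches how the criteria are described here.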
Web attacks originating from the public cloud saw a 16% spike from November to December 2019. AWS was the top source of attacks, responsible for 94% of all web attacks coming from public clouds. This suggests that public cloud companies should be auditing malicious behavior on their platforms.
In the same month that the coronavirus outbreak first came to light, two new spam campaigns that relied on the hype around coronavirus were observed.
These messages lure people to enter a site that tracks the spread of the virus and also offers the sale of shady pharmaceuticals.
Despite widespread concern over the recent Citrix Application Delivery Controller bug, it was only ranked as the 176th most frequent attack vector seen this month.
For comparison, high-profile attack vectors such as this typically rank among the top 20. The Citrix bug accounted for 200,000 attacks detected, while the top attack vector in January accounted for over two billion attacks.
More than half (51%) of the attacks against the adult industry were remote code execution (RCE). These attacks pose an inflated risk because a remote attacker can run malicious code to hijack the server and access its data.
Most of the top 10 countries in which attacks originated were targeting sites within the same country. The exceptions were attackers from Germany and China who targeted U.S.-based websites.
This can be attributed in part to the fact that many websites under attack from different regions are located in U.S. data centers. This finding shows that even cyber attacks conducted by foreign adversaries often appear to originate locally.
Cloud Server Market Strategies and Insight Driven Transformation 2019-2025 – News Parents
The research study provided by DataIntelo on Global Cloud Server Industry offers strategic assessment of the Cloud Server Market. The industry report focuses on the growth opportunities, which will help the Global Cloud Server Market to expand operations in the existing markets.
Next, the report presents the competitive scenario of the major market players, focusing on their sales revenue, customer demands, company profiles, import/export scenario, and business strategies, which will help emerging market segments make major business decisions. The market has the potential to become one of the most lucrative industries, as factors such as raw material affluence, financial stability, technological development, trading policies, and increasing demand are boosting market growth. Therefore, the market is expected to see higher growth in the near future and a greater CAGR during the forecast period from 2019 to 2026.
Major players included in this report are as follows: IBM, HP, Dell, Oracle, Lenovo, Sugon, Inspur, CISCO, NTT, Softlayer, Rackspace, Microsoft, Huawei.
Cloud Server Market can be segmented into product types as: Logical Type, Physical Type.
Cloud Server Market can be segmented into applications as: Education, Financial, Business, Entertainment, Others.
Cloud Server Market: Regional analysis includes: Asia-Pacific (Vietnam, China, Malaysia, Japan, Philippines, Korea, Thailand, India, Indonesia, and Australia); Europe (Turkey, Germany, Russia, UK, Italy, France, etc.); North America (United States, Mexico, and Canada); South America (Brazil, etc.); the Middle East and Africa (GCC countries and Egypt).
The Cloud Server Market report provides a complete analysis of the parent market, including dependent and independent sectors. The report offers strategic recommendations based on senior analysts' consultation, giving clients a clear perspective on which strategy will best help them penetrate a market. Further, the report sheds light on raw material sources, organizational structure, production processes, capacity utilization, value chain, pricing structure, technologies, equipment, product specifications, distribution channels, and serving segments. It demonstrates graphical information with figures and pictures for elucidation.
Key highlights of this report:
- The report covers Cloud Server applications, market dynamics, and the study of emerging and existing market segments.
- It portrays market overview, product classification, applications, and market volume forecast from 2019-2026.
- It provides analysis of the industry chain scenario, key market players, market volume, upstream raw material details, production cost, and marketing channels.
- The growth opportunities and limitations to market growth are identified using SWOT analysis.
- It conducts a feasibility study, explores industry barriers and data sources, and provides key research findings.
- The report delivers analysis of consumption volume, region-wise import/export analysis, and market forecast from 2019-2026.
About DataIntelo: DataIntelo has set its benchmark in the market research industry by providing syndicated and customized research reports to clients. The company's database is updated daily to prompt clients with the latest trends and in-depth industry analysis. Our pool of databases covers various industry verticals, including IT & telecom, food and beverages, automotive, healthcare, chemicals and energy, consumer goods, and many more. Each report goes through a proper research methodology, validated by professionals and analysts, to ensure high-quality reports.
Contact Info: DataIntelo. Name: Alex Mathews. Email: [emailprotected]. Website: https://dataintelo.com. Address: 500 East E Street, Ontario, CA 91764, United States.
Executive interview: Making IT sustainable – ComputerWeekly.com
In October 2019, the Department for Environment, Food and Rural Affairs' (Defra) Sustainable technology annual report 2018 to 2019 reported that cloud-first and digital agendas, policies and strategies have led to the closure of inefficient on-premise datacentres.
But as departments adopt more efficient cloud, private cloud or colocated datacentres, the operators must become more transparent in terms of sustainability, says Susanne Baker, who is responsible for TechUK's climate change programmes. Baker believes there is now greater emphasis on cloud providers to demonstrate how efficient they really are.
According to Greenpeace, since 2010, when it first started reporting on datacentre energy use, more than 20 of the largest internet companies including Facebook, Google and Apple have established public commitments to power their digital infrastructure with 100% renewable energy.
Last year, the environmental group claimed that, unlike other leading IT firms that have adopted 100% renewable energy commitments, Amazon has remained notoriously opaque when it comes to publicly reporting information about its current energy use.
The Defra sustainability report found that while government departments are indeed lowering the carbon footprint of their IT estates, the increasing use of cloud providers by public sector organisations makes it far harder to calculate their overall carbon footprint.
"There are high-level protocols to assess the carbon footprint of outsourced cloud services," says Baker.
But as every organisation becomes digitised and consumes more cloud services, she says it is now extremely hard for businesses to account for the carbon emissions these cloud-powered services create.
In a datacentre, servers tend to use power continuously because they operate 24/7. According to Baker, server lifecycle carbon impact is heavily dominated by the applications being run. As such, it is environmentally good practice to replace older servers with new ones on a regular basis. This runs contrary to perceived wisdom that assets should be sweated for as long as possible.
In Baker's experience, servers over five years old are unlikely to contribute to an efficient IT operation. She says third-party providers are generally incentivised to optimise refresh rates, compared with in-house operations, where the datacentre is not run as a business unit.
"The large, hyperscale operators may replace the central processing units as frequently as every 12 months, but two to three years is more common practice," she adds.
Services that run in the cloud consume processor cycles, storage and network bandwidth. These parameters can be measured, as they are used for billing. But, according to Baker, an accurate assessment of how green a particular service is remains very complex due to the highly distributed nature of cloud computing. This is a problem for the government and large enterprises as they attempt to meet sustainability targets.
For instance, in the past, a traditional media company may have been able to measure its carbon footprint by following its supply chain to assess the impact of printing newspapers and distribution. But, as media firms have moved to the cloud, it is now far harder to understand their carbon footprint.
Baker says an executive at a traditional media company recently told her that now the product is in the cloud, the company does not have "the faintest idea" of its carbon footprint.
Unlike when newspaper and magazine printing was outsourced to a printing company that needed to have its own sustainability reporting, it is difficult to gather evidence in the cloud. Not only do the media companies that wish to calculate their carbon footprint now need to understand how much energy is being used to host their digital media services, they also need to count the carbon footprint impact of the readership accessing those services.
And here lies another issue organisations face as they try to reduce their carbon footprint in the drive for net zero emissions. Since Moore's Law was formulated in 1965, the industry mantra has been to offer more for less: more processing power, doubling roughly every 18 months to two years; more bandwidth; and more storage for the same cost.
This has driven up consumption of digital products and services: people are enticed to use more because the industry claims this is better and costs nothing extra. For instance, standard-definition video is now all but defunct. Its successor, high-definition (HD), is quickly being replaced online by Ultra HD video, and such content consumes ever more bandwidth. Google's recommended bandwidth for its Stadia game-streaming service is 10Mbps for 720p, 20Mbps for 1080p HD and 35Mbps for the best experience in 4K Ultra HD.
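Those bitrates translate into substantial data volumes over a viewing session. The conversion below is a simple illustrative sketch (the bitrates are the Stadia recommendations above; the one-hour viewing time is an assumption):

```python
# Convert streaming bitrates (megabits per second) into data volume per hour.
# 1 byte = 8 bits; 1 GB = 1000 MB here (decimal units, as network vendors use).

def gb_per_hour(mbps: float) -> float:
    """Data consumed in gigabytes for one hour of streaming at a given bitrate."""
    megabits = mbps * 3600       # megabits transferred in one hour
    return megabits / 8 / 1000   # megabits -> megabytes -> gigabytes

for label, mbps in [("720p", 10), ("1080p HD", 20), ("4K Ultra HD", 35)]:
    print(f"{label}: {gb_per_hour(mbps):.2f} GB/hour")
# 720p: 4.50 GB/hour
# 1080p HD: 9.00 GB/hour
# 4K Ultra HD: 15.75 GB/hour
```

An hour of 4K streaming thus moves more than three times the data of the same hour at 720p, which is the "byte-intensive" effect Baker describes.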
While Stadia and other premium internet services such as Spotify and Netflix charge for higher-quality streaming, which requires more bandwidth, much of the internet is free. Baker says consumers need to appreciate the environmental costs of these supposedly free services. While the younger generation prefers on-demand video services and YouTube, broadcast has a far lower carbon footprint, she says.
But there is no going back to the era when families would all gather in front of the TV to watch a broadcast. Instead, consumers are being offered higher and higher definition video streaming.
"How do you communicate to the consumer that high-definition video is very byte-intensive? There is an impact at every stage," says Baker.
For example, if someone creates an ultra-high-definition video, uploading the file will consume some network bandwidth and it will use significantly more storage than a standard-definition video. But, asks Baker, what happens if the video goes viral and is downloaded by 800,000 people?
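A back-of-the-envelope sketch makes the scale of Baker's scenario concrete. The file size below is a hypothetical figure for illustration, not a number from the article:

```python
# Hypothetical figures: a 4 GB ultra-high-definition video file that goes viral.
upload_gb = 4.0        # the single upload consumes bandwidth and storage once
downloads = 800_000    # Baker's viral-video scenario

# Every download is served again from datacentre storage across the network,
# so the one-off upload cost is dwarfed by the aggregate delivery cost.
total_transfer_gb = upload_gb * downloads
print(f"Total network transfer: {total_transfer_gb / 1e6:.1f} PB")  # 3.2 PB
```

One modest video file, downloaded at that scale, moves petabytes of data through servers and network equipment.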
Just as with high-definition premium internet services, each download will need to be processed in datacentres running servers, network switches and gateway servers, somewhere on the global internet.
For some applications, consumers are likely to make a conscious effort to avoid the least efficient options. For instance, Baker says the smartphone apps that are most energy-intense are the ones that get weeded out.
But she says the advertising-based business model, where neither producers of content nor consumers are charged, does nothing to promote or encourage energy efficiency. "If you had to pay to upload a photo to Facebook, there would be outrage," she says. "There is no information on the energy impact of the photo."
However, Baker believes the Defra report, which was produced in conjunction with the HMG Sustainable Technology Advice and Reporting team, will help to drive greater transparency in the datacentre operators market. "I am seeing change influenced by the government, which has net zero emissions goals," she says. "There is a greater level of scrutiny."
The growing pressure on datacentre and cloud operators to provide transparent reporting on sustainability could benefit both businesses and consumers. For instance, in government departments, Baker says, green ICT is becoming a tender requirement, and the sustainability credentials of providers are part of the decision-making process.
As organisations assess what to outsource, and whether to use cloud infrastructure or a cloud platform, green ICT is becoming a tender requirement for them too.
"The companies that ask questions on sustainability are very big customers, and this will apply pressure," says Baker. "I would hope that in five years there will be much higher levels of sustainability reporting."
Read the original:
Executive interview: Making IT sustainable - ComputerWeekly.com
Save $484 on Dell’s PowerEdge small business server with this promo code – ZDNet
Dell is holding a sale where you can save up to 50% on select PowerEdge servers for small businesses. There are quite a few discounted models available, but we spotted one deal in particular that works with a promo code, bringing the price down another $300.
Disclosure: ZDNet may earn an affiliate commission from some of the products featured on this page. ZDNet and the author were not compensated for this independent review.
The Dell PowerEdge T40 tower server can be the "building block" for your small business, according to Dell, as it can handle common workloads such as file consolidation, storage, and sharing. It comes in a tower chassis with room for up to three 3.5-inch hard drives, and features an Intel Xeon processor, 8GB of memory, and 1TB of HDD storage.
This small business server is normally $833, but it's been reduced as part of Dell's PowerEdge server sale, and if you apply the promo code PD349SERVER at checkout, you can cut an extra $300 from the price, bringing the total to $349. That's a $484 savings.
Dell's PowerEdge T40 is a well-rated, dependable, on-site tower server that you won't regret investing in to support your small business. You can use it to locally manage your files and expenses while also avoiding the hassle of cloud processing and storage costs. It even comes with Dell EMC support, as well as a one-year hardware warranty.
Evolution of Infrastructure as a Service – App Developer Magazine
Infrastructure-as-a-Service (IaaS) has changed the world. In the past, a company had to invest time and resources into building and operating its own servers. So much effort went into maintaining infrastructure that little time remained for focusing on the company mission. IaaS changed all that. Today, a provider like AWS will build the server farm. A software company will use those servers to build applications, and their customers will leverage both of those tools to develop their own products.
IaaS has changed the world because the best thing for the market is equilibrium. Equilibrium can only be achieved when every company specializes in what it does best. Cloud infrastructure enables a focus on core competencies. When organizations are able to focus on what they do best, they can deliver focused, better products, which in turn better serve end customers' needs.
Dudai: The aaS acronym refers to a cloud solution for business that is provided as-a-service. Infrastructure-as-a-service (IaaS) more specifically refers to self-service, pay-per-use storage space, networking equipment, and services. It is highly scalable, automated, self-provisioned and gives users far more granular control over their environments.
IaaS has evolved entire industries because it enables companies to deliver a more focused, better product to customers. It allows companies to build specialized technology stacks that help them do what they do best, ultimately differentiating them from the competition.
Dudai: Some would say that IaaS, SaaS, and PaaS are part of a family tree. SaaS is one of the more widely known as-a-service models where cloud vendors host the business applications and then deliver to customers online. It enables customers to take advantage of the service without maintaining the infrastructure required to run software on-premises. In the SaaS model, customers pay for a specific number of licenses and the vendor manages the behind-the-scenes work.
The PaaS model is more focused on application developers and providing them with a space to develop, run, and manage applications. PaaS models do not require developers to build additional networks, servers or storage as a starting point to developing their applications.
Dudai: When first introduced, IaaS provided mostly compute (virtual servers such as EC2) and object storage (such as S3). Today it has matured to provide a much richer set of managed services, from databases, to containers, to serverless functions, to message queues and more. This has enabled companies to leverage more and more capabilities in the cloud and spend more time differentiating their product.
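The self-service, pay-per-use model Dudai describes is typically driven entirely through code. The sketch below is a hypothetical infrastructure-as-code fragment (the provider region, AMI ID and resource names are placeholders, not from the interview) showing compute and object storage provisioned together:

```hcl
# Hypothetical Terraform sketch: provisioning IaaS compute and object storage.
provider "aws" {
  region = "us-east-1"
}

# A small virtual server (the "EC2" style compute Dudai mentions).
resource "aws_instance" "app_server" {
  ami           = "ami-0abcdef1234567890" # placeholder AMI ID
  instance_type = "t3.micro"
}

# An object-storage bucket (the "S3" style storage Dudai mentions).
resource "aws_s3_bucket" "app_data" {
  bucket = "example-app-data-bucket" # bucket names must be globally unique
}
```

Running this through a plan/apply cycle, rather than racking physical hardware, is what lets a small team spend its time on product differentiation instead of infrastructure.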
Dudai: IaaS is now enabling more disruption across all markets and industries, as the same capabilities available to larger companies are now also available to the smallest startup in a garage. This includes advances in AI and machine learning (as a service), data analytics, serverless technologies, IoT and much more. This is also forcing large companies to be as agile as a startup.
Dudai:IaaS enables businesses to deliver a more well-honed, better product to its customers. IaaS is like a microeconomy, where supply equals demand, creating an equilibrium. With IaaS, everyone in the market is specialized and can do their best work. IaaS enables companies to build the best app that focuses on solving problems for their customers rather than spending resources spinning up and managing their own infrastructure.
Dudai: This balance opens the door for broader, deeper, richer and more robust products and services, with faster turnaround time and faster time to market.
Dudai: Without it, companies run the risk of falling behind and losing business to their more agile competition.
Sagi Dudai is Chief Technology Officer. In this role, he is responsible for leading Vonage's technology vision, architecture, and design, overseeing all aspects of technology development, including new products, their enabling technologies, and R&D. Prior to being named CTO, Mr. Dudai was Senior Vice President of Software Engineering for Vonage, responsible for software development company-wide.
Mr. Dudai has more than 20 years of experience at the forefront of the fastest-moving technology industry trends, including cloud, mobile, machine learning and artificial intelligence (AI). Prior to joining Vonage in 2012, he held engineering leadership roles at various technology companies, including three startups (Mercury Interactive, TelMap and fring), which were later acquired by Hewlett-Packard, Intel and GenBand, respectively. Earlier, he also worked on a number of classified software and hardware engineering projects for Israeli intelligence.
He graduated from the prestigious IDF computer science training program and earned his B.A. in Computer Science & Business Administration from Tel Aviv University. He also holds an Executive M.B.A. from Northwestern University's Kellogg School of Management.
Turn government paperwork into efficient workflows – here's the proof – ITBusiness.ca
For the people they serve, governments want accountability and transparency. The realities of bureaucracy mean decisions need input, review and signoffs from multiple stakeholders. That's understandable, but the process consumes forests.
"There are complex approval chains in government that are totally locked in on paper," says Jared Kolb, director of partnerships at Proof and head of the firm's Toronto office.
Why the continued reliance on paper processes? "A lot of inertia is built up in the system," says Kolb.
That's changing. As governments embrace innovation, they're looking to software-as-a-service organizations to re-imagine their business operations. Proof (proofgov.com) is filling the urgent need to help governments go paperless.
Proof's technology is designed for government approval processes. The company's intuitive web interface allows public servants and government officials to build and customize routings specific to their workflows.
Users attach documents on secure cloud servers and can share, search and retrieve information across multiple locations and repositories. Proof's workflow management software provides automated audit trails, which include approval sequences, document revisions and timestamps.
Approvals can be signed at the click of a button. With Proof, e-signatures are legally binding and unique to each user.
Proof digitizes citizen-facing and internal forms to ensure information is collected and stored seamlessly. Requests from submitted forms can be assigned to team members, with the status of the forms quickly tracked.
Kolb estimates that administrators using Proof save an average of one hour a day. For a 100-FTE department, Proof calculates a savings of $2.2 million in staff time. "When you multiply this out, there are serious savings in time and ultimately dollars," says Kolb.
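As a rough sanity check of those numbers (the working-days figure and implied hourly rate below are assumptions for illustration, not Proof's published methodology), one hour a day across 100 staff adds up quickly:

```python
# Hypothetical back-of-the-envelope check of the time-savings claim.
ftes = 100
hours_saved_per_day = 1        # Kolb's per-administrator estimate
working_days_per_year = 250    # assumed

annual_hours = ftes * hours_saved_per_day * working_days_per_year
print(annual_hours)  # 25000 hours of staff time per year

# A fully loaded cost of $88/hour would yield the quoted $2.2M figure.
print(2_200_000 / annual_hours)  # 88.0
```

The claim is internally consistent if staff time is costed at just under $90 per hour, a plausible fully loaded rate for government administrators.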
Proof aims to increase efficiency in government. Users can more closely monitor and prioritize approvals. Proof provides real-time analytics on workflows in all departments. Dashboards reveal the volume and flow of teams, which helps leaders to make decisions and improve processes.
"The data can be unlocked," says Kolb. "Once we digitize these processes, we begin to understand where the bottlenecks are."
Proof was co-founded by CEO Ben Sanders (based in Victoria), Chief Operating Officer Luke Decoste (based in Halifax), and Chief Technology Officer Wes George (based in Whitehorse). Between them, they have experience in software and engineering, leading startups and working with various government bodies.
Proof did its first pilot with the Yukon government in 2018 and is now working with governments at all levels, from the City of Winnipeg to Service Nova Scotia, to Transport Canada.
Kolb is excited to lead the Proof effort in Toronto. For one, there's the market potential: "There's an open posture towards our approach; we share a mind with many of the governments across Ontario."
Beyond that interest, Kolb says, Toronto is Canada's tech hub, and there's "an immense amount of talent in the Toronto startup community".
Proof was part of the 2019 class of the Techstars Toronto Accelerator, whose goal is to help entrepreneurs succeed through access to mentorship, talent and tech support.
"That culminated in a successful fundraising round, which enabled us to expand our team," says Kolb.
As Proof grows, it continues to help governments keep an eye on processes, to see where things are moving through the system or falling behind. "Our goal," says Kolb, "is to make government better."
Locking Down the Kernel and Securing the Container – Container Journal
Containers have taken the place of virtual machines as the go-to technology when multiple programs must run in isolation from one another on a single server. Virtual machines made practices such as cloud computing and web hosting possible. With virtual machines, a guest operating system and program code are packaged together, and each guest operating system runs as if it had the server to itself. In reality, it shares a single physical server with several other operating system/program packages, which is made possible by the host operating system. If this sounds confusing, think of it this way: several virtual machines run independently of one another on the same server.
While this approach solved many problems in computing, it also has some major issues, namely the processing overhead required to run numerous emulated servers. Most laptops today are not capable of running multiple virtual machines efficiently.
This is where container technology has stepped in. "Container tech has become extremely popular over the last decade, with large corporations such as Amazon and Google jumping on board," says Chael Anderson, a tech writer at Australia2write and Nextcoursework.
To mitigate the heavy compute costs often associated with virtual machines, a container packages only the application and the namespaces it needs, and interacts directly with the host operating system. To be clear: the only operating system on the server is the host operating system, which all the containers interact with. To make this work, containers rely on the host's kernel.
The kernel acts as a middleman between the operating system and a container; in fact, many containers may share a single kernel. The kernel limits which programs can access which data. For example, it may not be desirable for program A to have full access to the data and information stored within program B, even though they will need to share information at some point. The operating system, on the other hand, has full access to all data in kernel mode (when all memory is accessible).
Any compromise of kernel security can lead to major issues regarding sensitive data. "Although container technology has made computing less costly than the previous virtual machine approach, it has created a new type of security concern," writes Josh Playfair, a web developer at Britstudent and Writemyx.
Next, we will look at some things to keep in mind when locking down a kernel to secure the container.
Failure to lock down a kernel can result in a wide variety of issues, the most damaging being a malicious actor gaining access to parts of a container they were not meant to. Here are some best practices for preventing this.
Keep the kernel updated: The kernel should be updated to the newest version as soon as the host is created. The issue is not usually with the kernel itself, but rather with containers. It is not uncommon for containers to have vulnerabilities. Although these vulnerabilities are usually resolved quickly, failure to update to the latest version will still leave the container exposed to these issues. Checking whether the latest kernel is installed is simple; it requires running the command shown below:
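The original command was lost in extraction; on most Linux distributions, something like the following shows the running kernel version and, on Debian/Ubuntu hosts (an assumption here), whether a newer kernel package is pending:

```shell
# Print the kernel release currently running, e.g. 5.15.0-91-generic
uname -r

# On Debian/Ubuntu hosts, list any pending kernel package updates
apt list --upgradable 2>/dev/null | grep linux-image || true
```

If the running release lags behind the newest available kernel package, the host should be updated and rebooted.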
Use only SSH key authentication and remove the root user: The purpose of doing this is simple: if a container escapes isolation, we do not want the special privileges of the root user exposed to the host.
Furthermore, SSH password authentication is, by design, weaker than key-based authentication and is best disabled. To disable SSH password authentication, apply the configuration below:
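The original snippet was lost in extraction; on an OpenSSH server, the relevant settings live in /etc/ssh/sshd_config (restart the sshd service after editing for the change to take effect):

```
# /etc/ssh/sshd_config -- allow key-based logins only, and no root login
PasswordAuthentication no
PermitRootLogin no
PubkeyAuthentication yes
```

With these directives in place, only users holding an authorized private key can log in, and the root account cannot be reached directly over SSH.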
Use container security tools: Many quality container tools perform scans and alert the user if there is a security issue with a container. Docker is one of the most popular programs that offer this service.
The three tips listed above are some of the simplest yet most effective ways to lock down kernels and ensure container security. That being said, no container is ever fully secure or safe from attack, so continuous monitoring is essential. It is also recommended that those concerned with container security look into host security as well; the two topics go hand in hand in information security.
13 Cloud-Based Services Every Tech Department Should Invest In – Forbes
Cloud-based services are becoming more and more commonplace, and it's not hard to see why. The cloud can save companies time, money and frustration and offer more flexibility for employees to access data remotely.
Whether your business is just beginning its cloud journey or looking to upgrade to the latest and greatest cloud solutions, it's important to keep up with what industry leaders consider to be must-have services. That's why we asked a panel of Forbes Technology Council members which cloud-based services they believe every tech department should be using. Their best answers are below.
1. Slack
Though every company has its own specific requirements for services, communication is the key across all departments. Effective communication between employees, ease in sharing documents and files, seamless integration with enterprise applications, and private and public communication channels are key capabilities for efficient business operations that are all provided by Slack. - Sujeeth Kanuganti, Aira Tech Corp.
2. G Suite
One of the best cloud-based services for productivity and collaboration is G Suite. We're a remote company, and G Suite apps like Google Docs, Google Sheets and Google Drive make it easy for our teams to collaborate on projects. We can use it to create and share company documentation as well. Plus, you can get started with G Suite for free, which is great for startups and small businesses. - Thomas Griffin, OptinMonster
3. Zoom
The cloud has revolutionized the way businesses operate, and communication is no exception to this. As a tech team that's partially remote, we've gravitated towards Zoom for both video and audio conference calling. It gives us the ability to quickly call meetings, which in turn allows us to operate more efficiently as a unit, no matter how far apart we are from one another. - Marc Fischer, Dogtown Media LLC
4. Expensify
Get rid of those nasty receipts and use Expensify. It's been a lifesaver for us as a company as we continue to grow. You can do everything through the app, even online expenses. Just take a screenshot of a receipt and it automatically uploads it into an expense report. It saves the thousands of hours of time and energy that we were spending on those expense reports. - Christopher Carter, Approyo
5. The Public Cloud
Every IT department is managing a plethora of enterprise applications, mostly old ones, built before public cloud software was available. To be a modern IT group, they should be moving as much as they can to the public cloud. They should use the public cloud to rapidly prototype, experiment with new functionality and add new capabilities like machine learning and analytics to old applications. - Danielle Royston, Optiva Inc.
6. Communications And Help Desk Tools
The key is the word "can". Every time we run the numbers on the cloud for what we currently do and offer, the cloud is always more expensive and the risks of vendor lock-in are huge. Consequently, we look to the cloud when replacing systems or introducing new systems and services. We use the cloud for communications technology, help desk technology and software-development-management tech. - Richard Davis, Katalyst Data Management
7. Services That Support Your Core Capabilities
Your core capabilities should weigh in the most on the decision. The rest should be cloud-based; consider trade-offs in latency, availability, modularity, tech availability and maturity, security, and cost. Say you offer a high-throughput, high-volume computation requiring tailored hardware with a non-standard security protocol to support your client offerings; it might be best to handle it on-premises. - Florian Quarr, Exponential AI
8. Employee Communication
Any service that is involved with communicating to employees should be cloud-based and hosted off-site. This way there are no outages related to power or network issues, and it should be safe from disasters and storms. One example is email, but there are of course others. - Seth Wasserman, Menin Hospitality
9. An AIOps Platform
Every IT department needs to invest in and standardize on an AIOps platform that automates and quantifies the effectiveness of IT technology from end to end. These platforms help automate costly and cumbersome data analysis. They also identify the root cause of issues impacting device performance and security problems by constantly staring at the network from client to cloud and everything in between. - Abe Ankumah, Nyansa
10. Internal Messaging Tools
The one indispensable cloud service today is ad-hoc communications and messaging with products like Slack, Facebook Workplace or Microsoft Teams. These are all software as a service offerings that have become part of many startup and enterprise organizations. If an organization is still using email as their primary intercompany communication mechanism, they need to start moving to one of these productivity tools. - David Torres, Feedme Inc.
11. Cloud SSO And Password Managers
When adopting the cloud, all tech teams should have an identity strategy. Cloud single sign-on systems like Okta, Azure AD or Auth0 are convenient for users to access their cloud services and maintain a point of control for IT. Even when these SSO services are not in use, tech teams should use cloud-based password managers such as LastPass or even the password manager built into Chrome. - Steve Pao, Hillwork, LLC
12. Cloud-Based Linux Servers
If you haven't moved your servers, such as web hosting, VPN, etc., then you are behind the times. Cloud computing is already very competitive in terms of cost. The flexibility and scalability of a cloud-based Linux server is tremendous, especially when taking into account downtime and practicality. Sometimes cloud-based Linux servers even cost less than their dedicated counterparts. - WaiJe Coler, InfoTracer
13. Digital Data Twins
Tech departments managing a mix of on-premise and off-premise public/private cloud should have an externally located, full cloud-based mirror (digital twin) of all data assets. This is for disaster recovery and business survival purposes, as well as to test the risks and potential of the public and private cloud. Organizing all data for mirroring will itself identify opportunities for efficiency and productivity. - Michael Gurau, Kaiser Associates, Inc.