Category Archives: Cloud Computing
Shaping the Dallas Data Center Market: Supply and Demand Trends – Data Center Frontier
The data center development activity in the Dallas region suggests growing confidence in the future growth of demand in the Greater Dallas market. (Image: Digital Realty)
Last week we launched a special report series on the Dallas data center market. This week, we'll look at trends in market supply and demand in the Dallas/Ft. Worth region.
One of the primary factors driving data center activity in Dallas is the healthy economy and the region's growing roster of large businesses, which generate demand for digital infrastructure.
Data center requirements in the DFW market come from companies both inside and outside the market. The area is home to the corporate headquarters of 18 Fortune 500 companies, including Exxon Mobil, JC Penney, AT&T, and Texas Instruments. State Farm's data center in Richardson, just south of its 2.5 million SF regional office campus, is a good example.
In addition, a portion of the data center activity has come from companies tasked with upgrading aged data center infrastructure within an owned facility. Instead of reinvesting in the existing operation, many of those companies have chosen to house their infrastructure with colocation providers, fueling development in the area. This trend is pervasive across all major markets.
Companies from outside the Dallas market view the area as strategic for multiple reasons, including its central location. For enterprises that have data centers in primary coastal markets such as Northern California or New York City, the DFW market is a logical location to be in the center of the United States. In addition, the steady colocation supply delivered over the past few years puts DFW in a position to compete for large, national projects. Larger colocation providers completed record transaction sizes in DFW over the past few years, with some being as large as 10 MW of critical load.
The DFW market has been a beneficiary of the growth of cloud computing as well. Cloud providers, including IBM SoftLayer and Rackspace, have placed their infrastructure with larger colocation providers. For instance, at Digital Realty's 68-acre Richardson campus, a major portion of the campus's 86 MW is accounted for by cloud providers. Additionally, hyperscale cloud companies are actively developing their own data centers and leasing capacity throughout the region. It's anticipated this trend will continue, as data center users further embrace cloud computing and cloud providers rely on larger colocation providers for infrastructure support.
From a demand perspective, the DFW market averages approximately 30 MW of net new growth per year. This growth traditionally comes from the financial, technology, managed services/cloud, telecommunications, and healthcare industries. To handle the anticipated demand, several providers have announced expansions and/or entrances into DFW.
It's anticipated that pricing will remain aggressive in the DFW market given the amount of supply and the new companies entering the market.
Development in the Dallas market is driven by major wholesale data center providers, who have been creating multi-building campuses to support long-term growth. Colocation and hyperscale players are also eyeing future capacity, adding up to massive planned capacity of 811 megawatts (5.5 million square feet) in Dallas/Fort Worth.
Nearly all of that is for future runway. There were just 11.25 megawatts of data center capacity under construction in 2Q 2021, representing 60,000 square feet. That's an increase from just 2 MW in 1Q 2021 and 4 MW in 4Q 2020. The COVID-19 pandemic has been a factor in the construction slowdown, but some other markets (most notably Northern Virginia) saw building booms as the pandemic prompted major shifts to online delivery models for work, learning and entertainment.
The primary factor in the subdued pace of new construction is the inventory outlook in Dallas. There were 51 MW of commissioned capacity available in 2Q 2021, while leasing over the previous 12 months absorbed about 32 MW, with 3.3 MW coming in the second quarter, according to datacenterHawk. The vacancy rate was 11.96% in 2Q.
This is remarkably similar to the supply picture at the beginning of 2018, when the Dallas market had 50 MW of available capacity and 11 MW under construction, though with a higher vacancy rate of 15%. This history is important, reflecting a disciplined approach to expansion by providers in Dallas/Fort Worth, where most of the leading players have significant experience operating in the region.
This ample supply of available space, along with the number of providers in the market, creates a tenant-friendly buyer's market. But several recent deals have begun to change that dynamic, as the large cloud and SaaS (software-as-a-service) players gobble up data center space in large chunks.
The key question is whether hyperscalers will build or buy. Some cloud platforms and SaaS providers have leased wholesale space in Dallas. But the largest deployments have been company-built campuses for Facebook in Fort Worth and Google in Midlothian, 25 miles southwest of Dallas.
Digital Realty has also sourced solar power from Pattern Energy's 82.5 MW Phoenix Solar Project in Fannin County, Texas, to support its data center portfolio in Greater Dallas.
The region's data center capacity is spread across several sub-markets, including major carrier hotels in downtown Dallas and a nexus of wholesale data centers in the Telecom Corridor spanning Richardson, Plano, Garland and Allen. These northern suburbs are emerging as key to the region's data center supply, offering more land for the multi-building campuses that developers covet.
Fort Worth is about 30 miles west of downtown Dallas, and has emerged as a sub-market to watch in the wake of Facebook's decision to build a data center campus in the Alliance Gateway business park.
Here's a look at some of the recent developments shaping the data center market in the Dallas/Fort Worth area.
Download the full report, Dallas Data Center Market, courtesy of Digital Realty, to learn more about this competitive data center market. In our next article, we'll explore the region's business environment, including power, disaster risk, tax incentives, and connectivity. Catch up on the previous article here.
Powering the Next Era of Cloud Services – Rapid launches Cobolt with exciting new features and limitless possibilities for your business – TechJuice
Earlier this week RapidCompute, Pakistan's largest local cloud service provider, announced the formal launch of Cobolt (Powered by OpenStack), a platform that aims to give customers tools to manage the core cloud-computing services of compute, networking, and storage faster and more efficiently. With the launch of this new product, the company has also introduced a new customer portal, Portal 3.0, which allows customers to perform the operations that come with the new product offerings that are part of Cobolt.
Rapid has always considered itself to be a pioneer in cloud computing trends and services in Pakistan. Keeping in line with this approach, the organization has introduced some advanced features available on leading global clouds that enable the customer to manage their infrastructure through a simple and intuitive interface.
Portal 3.0 was developed with a very clear objective in mind: to provide users with complete control of their computing resources and the ability to create clusters and deploy multiple nodes in a complex environment with just one click. It is fully optimized for web browsers and iOS/Android tablets and phones, giving ready access to critical services on the go.
The portal was built with a modular approach to provide flexibility and agility, integrating several globally used and locally sought-after products.
All of these products boost business agility, availability, and efficiency. They not only give IT staff better access to IT resources, but also make it convenient for developers to provision machines rapidly on demand. Faster deployment of resources also means business units can roll out and complete projects earlier than before.
According to Mr. Shahzaib Khan, Commercial Head, RapidCompute:
"The portal is an easy way to deploy resources in complete harmony with the needs of your business. It enables management to drive the transformations required for the business to innovate faster and build a sustainable competitive advantage. In the age of the customer, it is not just the enterprise sector that can benefit from OpenStack but also startups that are working on bringing new and innovative products to the market."
He also mentioned how the company engages with customers for feedback, which allows it to better understand what customers seek, and highlighted the cutting-edge features that RapidCompute provides, unlike any other cloud provider in Pakistan.
"The edge we have as a local cloud service provider is that our teams understand the needs of this market and are always working to bring in the best practices and popular features used by international cloud providers."
On Cobolt (Powered by OpenStack), companies can adjust their infrastructure to the needs of the business. OpenStack enables them to provision machines on demand, significantly reducing development and testing periods and leaving more freedom to experiment with ideas. It is a platform that suits businesses of all sizes, designed to be ready for scaling, whether that's scaling up or down.
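On any on-demand platform of this kind, being "ready for scaling, whether that's scaling up or down" usually comes down to a simple control loop: watch utilization, then add or remove instances. A minimal, platform-agnostic sketch in Python (the thresholds and function name here are illustrative assumptions, not part of Cobolt's actual API):

```python
def desired_node_count(current_nodes: int, avg_cpu_pct: float,
                       scale_up_at: float = 75.0, scale_down_at: float = 25.0,
                       min_nodes: int = 1, max_nodes: int = 10) -> int:
    """Decide how many nodes a pool should have, given average CPU utilization."""
    if avg_cpu_pct > scale_up_at:
        # Pool is hot: add a node, but never exceed the configured ceiling.
        return min(current_nodes + 1, max_nodes)
    if avg_cpu_pct < scale_down_at and current_nodes > min_nodes:
        # Pool is idle: release a node to save cost, respecting the floor.
        return current_nodes - 1
    return current_nodes  # utilization is in the comfortable band

print(desired_node_count(3, 90.0))  # busy pool -> 4
print(desired_node_count(3, 10.0))  # idle pool -> 2
```

In a real deployment, the returned count would drive calls to the cloud's compute API; the decision logic itself stays this simple.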
Imtiaz Ahmed Khan, Chief Technical Officer, Rapid, commented,
"Being the pioneer of cloud computing in Pakistan, Rapid is constantly working towards integrating new products and cutting-edge features. With Cobolt, we are the first Pakistani cloud provider to offer Magnum (Kubernetes as a Service) and Trove (Database as a Service). The launch of Portal 3.0 is just the beginning. Soon, you'll be hearing about the addition of many more products to the Cobolt ambit."
Rapid is currently the largest public and private cloud provider in the country and recently celebrated a decade serving Pakistani enterprises. It has been Pakistan's trailblazer in offering cloud computing, networking services, and management tools for digitizing all areas of business and industry.
Rapid continues to be one of the largest providers of cloud services to the government, financial sector, and large-scale enterprises. To procure Rapid's cloud services, get in touch at sales@rapidcompute.com or info@rapidcompute.com.
Riding high on rapid cloud adoption, India’s SaaS sector is all set to score big in 2022 – Economic Times
Technology and digital transformation have been key buzzwords for a couple of years now, with businesses large and small shifting their operational activities to the cloud and online for better productivity and ease of doing business. Owing to the pandemic and the lockdowns that took effect, working remotely became a necessity, and organizations across sectors and sizes adapted to the new norms of remote working and digitization in their processes and offerings.
With the growing uncertainty of the times and the likely realities of the "new normal," more and more organizations are now charting the course for transformation and moving towards cloud computing and digitization. Looking at global counterparts, Microsoft CEO Satya Nadella said that the company had seen two years of digital transformation in two months as its customers adopted cloud solutions.
The global pandemic has pushed and accelerated the rate of cloud adoption, acting as a catalyst and enabling flexibility with respect to cloud computing and its acceptance. Gartner's research forecasts that worldwide end-user spending on public cloud services will grow 18.4% in 2021 to total $304.9 billion.
In addition to digital services and cloud computing, India's SaaS industry is projected to hold immense growth potential. A report by McKinsey and SaasBoomi has predicted that the industry will be worth $1 trillion by 2030.
SaaS companies like Freshworks and Salesforce are just two of the many in this space. Freshworks became the first Indian SaaS company to list on Nasdaq, on 22 September 2021, raising over $1 billion at a market valuation of $10 billion. The Freshworks IPO opened doors for many other start-ups in the same space.
The Indian SaaS space has been growing rapidly, projected to grow at ~30% CAGR over 2020–25 and to double its share of the global market to 8%–9% by 2025. India now has 13 SaaS unicorns, compared with one in 2018, and is the third-largest SaaS ecosystem globally, after the USA and China.
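As a quick, illustrative sanity check of those figures (arithmetic only, not a market forecast): a ~30% CAGR sustained over the five years 2020–25 compounds to roughly a 3.7x increase in market size.

```python
def compound_growth(cagr: float, years: int) -> float:
    """Total growth multiple implied by a constant compound annual growth rate."""
    return (1 + cagr) ** years

# ~30% CAGR over 2020-25 implies the market ends at about 3.7x its 2020 size.
print(f"{compound_growth(0.30, 5):.2f}x")  # -> 3.71x
```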
Overall, though, I feel the rise of software-as-a-service solutions isn't going anywhere. Businesses across all sectors and services are already using SaaS in some shape or form, and these options are only multiplying and set for exponential growth.
(Pradeep Gupta is Co-Founder & Vice Chairman, Anand Rathi Group)
Beyond the Cloud: Five Incoming Trends for 2022 – The Fast Mode
What does the future of data networking look like in 2022? If the past two years are any indicators, we are entering a challenging year where supply chain issues, the impact of unforeseen variants, and a recovering global economy will set the stage.
Next year will be about continuing to build resiliency and bringing cloud native computing to the edge. Increased demands from emerging applications, from industrial IoT to virtual and augmented reality, autonomous vehicles and even the metaverse, mean that we will see increasingly innovative technologies with high-bandwidth and low-latency requirements that will shape network buildouts. We're now a few years into the 5G era, and there is still a long road to travel before we can fully realize the benefits of 5G. Here are some predictions:
#1: Convergence of fixed and mobile for 'seamless services'
The intersections between the fixed network and 5G mobile networks will be a key component in the success of 5G deployment in 2022 and beyond. These are the interfaces between the 5G core, the wireline and wireless access networks, hybrid access nodes, and gateways.
3GPP is the 5G standardization body, working jointly with the Broadband Forum to ensure that the 5G packet core works not only for 3GPP access but also for non-3GPP access such as Wi-Fi 6 and fiber. This means that a single 5G packet core infrastructure will be capable of sustaining all the main access technologies in the marketplace. Thus Wi-Fi 6 and 5G could be combined into a single radio network pillar for some larger venues, creating a more seamless experience and allowing Wi-Fi and mobile devices to connect to a single radio network based on 5G technology. The possibilities here are exciting.
Work towards 3GPP standardization Release 18 will also begin in mid-2022. The fourth standard for 5G, dubbed "5G Advanced," includes major enhancements in artificial intelligence and extended reality, enabling highly intelligent network solutions that support a wider variety of use cases and bring more intelligence into wireless networks. These enhancements will provide the foundation for 5G manufacturing and industrial IoT applications.
#2: Cloud computing meets edge computing
We will continue to see the emergence of new applications hungry for ultra-reliable low latency communications applications from the connected car to interactive gaming, industrial robotics to the metaverse and more.
All these applications require latency below ten milliseconds, while the public cloud is currently only capable of latency substantially higher than that. The need for lower latency will ignite more enthusiasm for edge computing, bringing compute power closer to the devices that consume data and services. As a result, edge computing will continue to be a growing phenomenon next year, with more expansion to the edge.
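A back-of-the-envelope propagation calculation shows why distance alone can break a sub-10 ms budget. The ~200 km/ms speed of light in optical fiber is a standard rule of thumb; the example distances below are illustrative assumptions, and real round trips add routing, queuing, and processing delays on top:

```python
FIBER_KM_PER_MS = 200.0  # light travels roughly 200 km per millisecond in fiber

def propagation_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

# A distant public cloud region vs. a nearby edge site (hypothetical distances).
cloud_rtt = propagation_rtt_ms(1500)  # 15.0 ms -- already over a 10 ms budget
edge_rtt = propagation_rtt_ms(40)     # 0.4 ms -- leaves headroom for processing
print(cloud_rtt, edge_rtt)
```

Physics alone puts a 1,500 km cloud region out of reach for a 10 ms application, which is exactly the gap edge computing closes.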
#3: Need for more distributed cloud
Combining large centralized public clouds with smaller edge data centers leads to the need for the distributed cloud. Distributed cloud is a public cloud computing service that lets you run public cloud infrastructure in multiple locations and manage everything from a single control plane.
Distributed cloud is the foundation of edge computing: it creates a slice, like a virtual data center, that can span multiple geographically distributed physical data centers, allowing service providers to deploy applications consistently regardless of where the physical application resides. This concept plays a key role in providing high-availability services as we move forward.
#4: Migration towards 5G cloud native applications
The bulk of new applications will be cloud native: based on a DevOps environment, integrating continuous development, and built from microservices running in containers rather than on traditional virtual machines or bare-metal servers. 5G cloud native infrastructure is a key component and entirely necessary to reduce the overall costs of these services and applications.
#5: Rise of 5G private infrastructure
We will see growth in the deployment of private 5G networks next year in manufacturing facilities, airports, stadiums and 5G private networks at corporate headquarters. These will be deployed either over licensed or unlicensed spectrum and are fundamental for the success of 5G. More than 90% of data consumed by devices or users while indoors could be from a private 5G network; thus, it is imperative to ensure that with 5G, we get good coverage.
The 5G Core is the heart of a 5G mobile network. In 2022, momentum for standalone 5G rollouts will continue, but there is still a long road ahead. A recent survey of mobile operators by Heavy Reading in October 2021 revealed that 49% of operators plan to deploy 5G SA within a year, and a further 39% plan to deploy it within one to two years.
Standalone 5G is ten times faster, supports 10,000 times more network traffic and can handle 100 times more devices than 4G networks while enabling one-fiftieth the latency with zero perceived downtime for near-real-time responsiveness. It can also support massive numbers of devices, faster and more agile creation of services and network slices, and improved SLA management support within those slices.
With specialized machine-to-machine communication protocols and many emerging applications, massive IoT is waiting for 5G infrastructure to deploy. The pandemic, supply chain challenges and delays in 5G spectrum auctions may have set us back 12 to 18 months, but the overall size of the opportunity has not changed; we simply see a shift in time. Like our best-laid plans, we must reset our expectations and stay the course. As we have seen in previous generations of mobile, our vision for the future of data networking beyond the cloud must be long-term and sustainable.
Paradigm4 Joins the Tetra Partner Network to Leverage Tetra Data into High-performing Analytical Computing Applications for Pharmaceutical Customers -…
BOSTON, Jan. 18, 2022 /PRNewswire/ -- TetraScience, the R&D Data Cloud company, announced today that Paradigm4, an integrated scientific data analytics company, has joined the Tetra Partner Network (TPN) so that pharmaceutical customers can accelerate research decision-making using Tetra Data as a foundation for analytical computing.
"We're delighted to welcome Paradigm4 to the Tetra Partner Network so that customers can benefit from our complementary expertise in extracting full value from a range of scientific data sets," says Simon Meffan-Main, Ph.D., Vice President of Product for TPN. "Together we offer a seamless bridge from Tetra Data into Paradigm4's high-performance analytical computing platform so that customers can streamline laboratory processes and speed up research decision making."
The Tetra R&D Data Cloud ingests raw scientific data from disparate sources and engineers it into the industry's only universally adoptable format, Tetra Data, which is harmonized, compliant, liquid, and actionable. Tetra Data uniquely accelerates and improves scientific outcomes, and can easily be incorporated into Paradigm4's life sciences solutions, which enable sophisticated analysis of instrument data and other large multimodal datasets. The company's REVEAL™ APIs, including REVEAL: Biobank and REVEAL: MultiOmics, are a suite of use-case-specific applications that power discovery from population scale to n-of-1, with FAIR data access and elastically scalable analytics and machine learning.
"By partnering with TetraScience, we are extending the scope of our support to customers in life science and pharma research, enabling them with end-to-end solutions that readily integrate instrument data, such as proteomics data, with omics, imaging, and clinical data to extract biological understanding from their complex datasets," says Marilyn Matz, CEO and Co-Founder of Paradigm4.
"In order to unlock the potential of life science R&D labs and dramatically accelerate discovery, we must capitalize on the power of AI and data science. A precondition to enabling these capabilities is moving the industry away from a legacy data model of silos and point-to-point integrations, to a native and unified cloud-based data paradigm," explains Patrick Grady, Chief Executive Officer, TetraScience. "Our partnership with Paradigm4 is an example of what can now be done to enable the life sciences industry to accelerate discoveries that can help improve lives."
About TetraScience
TetraScience is the R&D Data Cloud company with a mission to transform life sciences R&D, accelerate discovery, and improve and extend human life. The Tetra R&D Data Cloud provides life sciences companies with the flexibility, scalability, and data-centric capabilities to enable easy access to centralized, harmonized, and actionable scientific data and is actively deployed across enterprise pharma and biotech organizations. As an open platform, TetraScience has built the largest integration network of lab instruments, informatics applications, CRO/CDMOs, analytics, and data science partners, creating seamless interoperability and an innovation feedback loop that will drive the future of life sciences R&D. For more information, please visit http://www.tetrascience.com.
About Paradigm4
Paradigm4 is an integrated scientific data analytics company co-founded by a Turing Laureate. Its high-performance computing solutions are being used by pharma and biotech companies to uncover new insights in near real time, capturing discoveries to make rapid progress in their research decision making. Users can streamline hypothesis generation and validation across multi-modal, proprietary and public datasets with the REVEAL™ suite of extensible apps, each of which provides an end-to-end application-specific solution. For more information, please visit http://www.paradigm4.com.
SOURCE TetraScience
Permiso emerges from stealth with $10M to tackle the next wave of cloud security – TechCrunch
Permiso, a Palo Alto-based startup that provides cloud identity detection and response for cloud infrastructures, has launched from stealth with $10 million in seed funding.
The company, founded by former FireEye executives Paul Nguyen and Jason Martin, joins an influx of cloud security startups that have emerged since the start of the pandemic, which saw organizations rush to digitize their operations to support employees working from home.
While arguably 18 months late to an already-crowded market, Nguyen and Martin believe their detection and response product is ready for the next wave of cloud security. The idea was inspired by conversations with Permiso's angel investors, including Netflix, who said that their number one problem in the cloud was identity.
"We started to realize that identity is the lynchpin that tells the story of what's happening in your cloud, and it's also the foundation for how you build detection in the cloud," Nguyen tells TechCrunch.
Permiso provides organizations with visibility for identities in their cloud infrastructure to give real-time insights into who is in the environment and what they are doing. This, the startup claims, allows for simple and efficient attribution of access, activity and changes occurring in monitored environments, helping organizations to spot malicious or anomalous behaviors that could indicate compromised credentials, policy violations or insider threats.
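As an illustration of the general technique (not Permiso's actual product logic), anchoring detection on identity can be as simple as baselining which actions each identity normally performs in audit logs, then flagging departures from that baseline; the event names below are hypothetical:

```python
from collections import defaultdict

def build_baseline(events):
    """Map each identity to the set of actions it has historically performed."""
    baseline = defaultdict(set)
    for identity, action in events:
        baseline[identity].add(action)
    return baseline

def flag_anomalies(baseline, new_events):
    """Return events where an identity performs an action outside its baseline."""
    return [(i, a) for i, a in new_events if a not in baseline.get(i, set())]

history = [("ci-bot", "s3:GetObject"), ("alice", "iam:ListUsers")]
incoming = [("ci-bot", "iam:CreateAccessKey"), ("alice", "iam:ListUsers")]
print(flag_anomalies(build_baseline(history), incoming))
# -> [('ci-bot', 'iam:CreateAccessKey')]
```

A CI service account suddenly minting access keys is exactly the kind of identity-anchored signal that would surface compromised credentials; production systems add scoring, time windows, and context rather than a strict set lookup.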
"We're a little ahead of the market," Martin said. "No one, as far as we're aware, other than the custom products that cloud-forward companies build for themselves, is focused on anchoring all activity around identity, the resources in that environment, and how they interact."
Permiso is billed as "built by experts, usable by non-experts," which the startup says is key given the growing skills shortage in the cloud security market. "When we looked at teams that were transitioning from on-prem[ise] to the cloud, it was like speaking English and trying to learn Farsi as a completely new language. It's starting from zero," said Nguyen.
"We built for 1% of the market for years, but that gets you 1% of the market. Now we're going after the 99%."
The startups $10 million seed round was led by Point72 Ventures and included Foundation Capital, Work-Bench, 11.2 Capital and Rain Capital. It was also backed by a number of security industry leaders, including Jason Chan, former VP of Information Security at Netflix; Travis McPeak, head of Product Security at Databricks; and Tyler Shields, CMO at JupiterOne.
Permiso will use the funds to triple its current 15-person team and expand its current customer footprint.
"Our investors have all solved this identity problem at scale in cloud-native environments, but other companies won't get to this point for another year or two," said Martin.
Federal operations are moving to the edge. IT must follow. – Federal News Network
The federal government operates an increasingly distributed mission. From battlefields to warehouses to agricultural inspections, agencies need a technology architecture that adapts to their dispersed yet complex computing needs.
The urgency of this need can be seen in the astronomical rate of data growth, particularly at the edge, beyond the traditional perimeter. By 2023, Gartner predicts, over 50% of the primary responsibility of data and analytics leaders will comprise data created, managed and analyzed in edge environments. Consider sensors that monitor remote environments or wearables collecting insights from frontline workers; federal agencies need the infrastructure to process and act on this type of data, often at speed.
Edge solutions are the answer.
By combining localized computing, sophisticated Internet of Things (IoT) devices, and 5G network connectivity, edge solutions will help agencies conduct data-driven, advanced operations, no matter where the mission goes. The technology can deliver more timely analysis to soldiers on the battlefield, and can empower civilian agencies in essential mission sets, from telemedicine to disaster response.
The military recognizes this potential. The Defense Department's Joint All-Domain Command and Control (JADC2) plans to use edge computing to connect sensors across military branches into a single network. This will give commanders vital information to drive warfighter performance in real time.
Edge solutions can be used to create a connected continuum of actionable data from the cloud all the way to the operational edge. The technology's greatest potential lies in bringing the best of what the cloud can offer, including advanced tools like artificial intelligence and machine learning, to the network's last mile.
In a recent report from Accenture, Extending IT to the Mission's Edge, more than nine in 10 federal technology leaders say that edge solutions are either very or extremely important to meeting their agency's mission needs.
For agencies operating dispersed missions or generating high volumes of data, edge solutions are critical to a holistic digital transformation. It will take a savvy marriage of cloud and edge capabilities to ingest and operationalize increasing quantities of data across all use cases. Without a focus on the edge, agencies may be leaving valuable data behind.
For agencies migrating to the cloud, there's at least one important reason to consider edge solutions now: it's more seamless, from an IT architecture and infrastructure point of view, to incorporate edge solutions earlier in agencies' digital transformations rather than bolting on solutions somewhere down the line.
Inaction today also opens the door to security vulnerabilities. New endpoints could become a cyber liability if IT leaders don't actively plan for their safeguards. More than half (54%) of those surveyed in the Accenture report cite cyber concerns as an obstacle to edge adoption.
There's an opportunity to architect for that risk early, by implementing a common control plane in support of cybersecurity across both cloud and the edge.
Edge solutions will become an essential element of agencies' digital transformation, but careful planning is needed today to bring that potential to fruition.
Federal leaders can take steps today to begin leveraging edge solutions in support of improved mission outcomes. How?
We are soon going to see a divide between those who are able to understand and take advantage of edge solutions, and those who are not. Agencies that move on edge solutions will have more intelligent insights to drive action across the mission, anywhere and everywhere.
Chris Bjornson is the Cloud Practice Lead and Kyle Michl is the Chief Innovation Officer at Accenture Federal Services.
For IT professionals, these are the 5 job roles to be in demand this year – Mint
Hiring activity has picked up amid demand for fresh talent as well as efforts to retain top talent, especially in the IT sector, which has been witnessing a spike in attrition rates. As hiring demand surges, here are some of the top roles and skills that will be in demand this year for Indian IT professionals, as per Divyesh Sindhwaad, Regional Vice President of Skillsoft.
Cybersecurity: According to Skillsoft's 2021 IT Skills and Salary report, 52% of IT decision-makers in the APAC region consider cybersecurity a priority for their team. More cybersecurity professionals are needed to protect organizations' valuable data and prevent cyberattacks.
Cloud Computing: This domain has been a critical investment area for many organizations since 2017 in the fast-evolving digital-first world. The same report highlights that cloud computing is a key investment area for 43% of decision-makers, so equipping yourself with cloud computing skills such as DevOps, serverless architecture, automation, and QA will boost your career.
Big Data: The art and science of collecting, analyzing, and using data securely to make informed decisions, data management is a priority for IT leaders. But organizations struggle to find qualified talent to fill big data jobs; in APAC alone, 23% of respondents report the same struggle.
Artificial Intelligence and Machine Learning: AI and ML have become part of our daily lives. They significantly drive automation that simplifies systems across industries, requiring organizations to constantly look for professionals skilled in this area. Job roles in this segment will be prioritized by 34% of APAC leaders.
Internet of Things: With more and more devices connecting to each other and adoption rates soaring, IoT has become a hot new technology for businesses. From its infancy in 2015 through 2021, the number of IoT devices has tripled, and so has the requirement for IoT engineers. Yet it remains one of the weaker skill sets reported by organizations, whether because of early-stage adoption or a lack of skilled professionals.
"Apart from these tech skills, individuals looking to kick-start or advance their careers also need to focus on power skills. Skills such as analytical and critical thinking, complex problem solving, leadership, and social influence are touted as the skills of the future according to the World Economic Forum's 2022 Skills Outlook," said Sindhwaad.
See more here:
For IT professionals, these are the 5 job roles to be in demand this year - Mint
Play the Bearish Side of Cloud Computing – ETFdb.com
It's been nothing but cloudy skies for the cloud computing space, allowing traders to play the bearish side of the technology sub-sector.
Even before cloud computing gained increased popularity during the height of the pandemic in 2020, the space was already seeing strength. More companies were moving their operations to internet-based applications powered by cloud computing platforms.
Big tech companies like Amazon and Microsoft were leveraging cloud computing technology to power their core operations. That only proliferated during the pandemic as more companies were forced to use cloud computing in order for remote work employees to access applications and interact with co-workers.
However, a confluence of events is starting to push cloud computing out of favor for investors. As the economy starts to re-open again despite rising COVID cases, the need for cloud computing may have weakened.
The ISE CTA Cloud Computing Index has fallen about 8% within the past few months. In addition to the economy re-opening, the Federal Reserve is looking to raise interest rates aggressively in 2022, which may hamper growth for cloud computing.
"The slump, which started in November and deepened this week, is part market rotation, part economy reopening from the pandemic, and part concern that the Federal Reserve's expected interest rate hikes will have an outsized impact on this particular sector," CNBC notes.
Higher interest rates can spell challenges for much of the market, but they represent a notable roadblock for cloud stocks, especially for companies that aren't making money yet, CNBC adds.
As bearishness permeates the cloud computing space, one way to capture the downtrend and profit is the Daily Cloud Computing Bear 2X Shares ETF (CLDS). The fund is up 26% within the past three months, underscoring the bearish sentiment in the industry.
The fund seeks to achieve 200%, or 200% of the inverse, of the daily performance of the Indxx USA Cloud Computing Index. The index includes domestic companies that deliver cloud computing infrastructure.
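Because a daily 2X fund rebalances its leverage every day, its multi-day return compounds the leveraged daily moves rather than simply doubling the index's period return. A minimal sketch below illustrates this with a hypothetical three-day path of index returns (the numbers are invented for illustration, not CLDS or Indxx data):

```python
# Sketch: how a daily-rebalanced leveraged fund compounds.
# Each day the fund multiplies that day's index return by its
# leverage factor, then compounds the results across days.

def leveraged_return(daily_returns, leverage):
    """Period return of a daily-rebalanced fund with the given leverage."""
    value = 1.0
    for r in daily_returns:
        value *= 1.0 + leverage * r
    return value - 1.0

# Hypothetical path: index falls 3%, rebounds 1%, falls 2%.
path = [-0.03, 0.01, -0.02]

index = leveraged_return(path, 1.0)     # the index itself
bear2x = leveraged_return(path, -2.0)   # daily -2x of the index

print(f"index:    {index:+.4%}")   # about -3.99%
print(f"-2x fund: {bear2x:+.4%}")  # about +8.04%, not exactly +7.98%
```

Note that the -2x fund's period return (+8.04%) is close to, but not exactly, -2 times the index's period return (+7.98%); this path dependence is why daily leveraged funds are generally positioned as short-term trading vehicles.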
"Across the basket, the cloud industry and software holistically has just been hammered," said Byron Deeter, a venture capitalist at Bessemer. "Fundamentally these businesses remain the drivers of the new economy, and we have to remember that all of those trends that people were excited about a year ago in the 2020 market, when this basket returned almost 100%, those remain today."
Go here to see the original:
Play the Bearish Side of Cloud Computing - ETFdb.com
How is Cloud Computing Changing the Logistics Industry? – Analytics Insight
Here is how cloud computing is changing dynamics in the logistics industry
Cloud computing is among the most crucial technological advances of our era. It's incredible to be able to store and access data from anywhere in the globe using any system. Many wouldn't have believed you if you had told them about it a few years ago, because it's such a novel notion. Technology, however, has made it a reality, and it has spawned a slew of new industries. Logistics management is one of these sectors, allowing people to manage their assets. Still, limited awareness of what the cloud can do here remains a problem worth addressing, because it has the potential to halt the spread of this incredible technology. In this article, you will learn how cloud technology is changing the logistics industry.
There used to be a lot of distinct factors that needed to be handled independently, which was a time-consuming process. Logistics and supply chain management can become very difficult very quickly, and if you can't keep up, you'll find yourself in a lot of trouble. You have a never-ending list of items to manage, such as receipts, stock, shipping, and so on. Cloud computing, however, has single-handedly transformed everything by providing unrivaled integration, bringing everything onto a single system.
When supply chain management is constrained by arbitrary political and territorial borders, it cannot achieve its full potential. Cloud-based technologies, however, address this issue, making it relatively simple to expand one's business beyond one's own borders. Cloud computing has managed to integrate everything so closely that even a fulfillment warehouse in the United Kingdom can easily handle assets in a country on another continent, such as Australia. Because of cloud computing's capacity to work without regard for geographical boundaries, the globe has become a more linked place.
Most governments throughout the world are taking every precaution to guarantee that cloud-based platforms are appropriately regulated. Because of these factors, cloud computing is more reliable than any other data management and storage method. Furthermore, the information is not stored in a single location, which protects against data loss from ransomware or hacker attacks. When compared to traditional logistical methods, knowing that your data can't be entirely wiped gives you peace of mind.
Whether you're dealing with small quantities or millions of dollars' worth of products, cloud computing can scale up and down to meet your needs. No other option provides this level of control over a business's scalability, and cloud computing has a stranglehold on this sector. This works nicely with growth, because the two aspects generally go hand in hand.
If you've not already made the switch, you may be surprised to learn that the cloud is much less expensive than traditional options. Despite its youth, it's easy to see why the cloud is so inexpensive: because little work is required to execute any single activity, the costs are low. Operational costs make up a large portion of every logistics company's expenses, and lowering them reduces all other expenses as well.
Cloud computing, contrary to popular assumption, is incredibly straightforward to integrate because it is so easy to master. If you like, you can manage all of the processes from a single interface, and with a little work you can grasp the front end. This means you won't have to entirely retrain your current staff; instead, you'll be able to reskill them in a short amount of time.
More:
How is Cloud Computing Changing the Logistics Industry? - Analytics Insight