Category Archives: Cloud Hosting

What our Uncapped Hosted PBX in the Cloud can offer your SME – ITWeb

More employees are working remotely than ever before. As a result, the traditional telephony system is becoming increasingly impractical. Domains.co.za's Uncapped Hosted PBX in the Cloud is a cutting-edge telephone solution that enables businesses to effortlessly place and receive perfectly clear phone calls from any location with internet access.

1. Unparalleled flexibility

With our solution, customers can make and receive calls seamlessly from anywhere they have a stable internet connection. This unmatched flexibility empowers them to work remotely, collaborate effectively and stay connected when on the go.

2. Crystal-clear communication

Our hosted PBX boasts crystal-clear VoIP voice call quality, providing an exceptional communication experience. Say goodbye to static, dropped calls, and poor sound quality. Enjoy conversations that are crisp, reliable and free from interruptions.

3. Cost savings

By adopting our uncapped cloud-based PBX system, customers can significantly reduce their telephony expenses. There's no need for expensive hardware installations or maintenance. Plus, with affordable pricing plans, your SME can optimise its communication costs and allocate resources to other important areas of the business.

4. 3CX software and Domains.co.za hosting

Hosted PBX in the Cloud harnesses the remarkable capabilities of the acclaimed 3CX platform. This award-winning platform offers an array of cutting-edge features, including built-in security, backup options, live chat functionality and seamless video conferencing. Added to this, our solution is reinforced by robust, locally hosted infrastructure, which offers an impressive 99.99% uptime record.

5. Management by Domains.co.za

Your SME can benefit from our professional management, which is included in the monthly fee. This way, you can rest assured that your system is in expert hands, allowing you to focus on your core business operations.

By adopting Uncapped Hosted PBX in the Cloud, SMEs can optimise their communication expenses, while enjoying the advantages that come with the solution. The cloud-based nature of the system ensures flexibility and scalability, allowing businesses to easily adapt to their evolving requirements. Whether it's remote work, flexible schedules, or distributed teams, hosted PBX seamlessly accommodates these changes.

View post:
What our Uncapped Hosted PBX in the Cloud can offer your SME - ITWeb

Twitter explains why it sabotaged your feed without notice but no … – TechRadar

On July 1, with no warning and little explanation, Twitter CEO Elon Musk announced that Twitter would be imposing strict rate limits on how many posts users could view. Now we have an official reason why, but it's still unclear how long this will last.

When Musk put out his tweet, he said that Verified accounts would be limited to reading 6,000 posts a day, unverified users to 600, and new unverified users to 300. His explanation for this sudden change was that it was to address "extreme levels of data scraping & system manipulation". The limit has since been expanded to 10,000 for verified and 1,000 for unverified users, but exactly what the problem was remained uncertain.

A post Twitter made on its Business blog has finally expanded on why it chose to limit how many posts someone can read in a day. Twitter explained that the move was one of its latest "extreme measures to remove spam and bots from our platform". The post goes on to say that Twitter is working to prevent bad actors from 1) scraping people's public Twitter data to build AI models and 2) manipulating people and conversation on the platform in various ways. Twitter adds that the move's effects on advertising have been minimal, likely in an effort to convince its advertisers that the posts they're paying to promote are still being seen by Twitter's user base.

Unfortunately, for those of you hoping that post limits would be on their way out, that doesn't appear to be the case. Twitter's blog post gave no indication of when, or even if, the limits will go away.

So we have a deeper understanding of why Twitter is limiting access to tweets, but is this the full story? Some would argue not, based on previous reports that argued Twitter's limits have nothing to do with data being scraped.

One alternative explanation was that Google was limiting Twitter's access to its cloud hosting services after Insider reported last month that the social media platform hadn't been paying its bills. However, it appears Twitter paid at least some of its debt to Google after new CEO Linda Yaccarino took charge, suggesting that this isn't the reason for the limits.

Another possibility came from web developer and Twitter user Sheldon Chang, who speculated that the change was imposed because the platform was effectively DDoSing itself. DDoS attacks are when someone intentionally sends a lot of requests to a platform's servers in order to overwhelm them and stop regular folks from accessing them. According to Chang, Twitter's home feed wasn't loading properly on the Saturday morning before Musk's announcement, yet the company was still making constant requests for data that never arrived. Chang argued that the resulting instability was caused by another of Musk's recent changes, which forces users to log in before they can view content on Twitter; however, this is merely speculation.

For now, we'll have to go along with Twitter's version of events until someone can prove otherwise, though the reasons do make some sense, particularly the part about AI. Considering Musk is working on his own generative AI, it would follow that he may not want his competitors to build models based on data taken from his social media platform.

Annoyed about the rate limits and want to leave Twitter? Here are three Twitter alternatives to help you move on from it. You might also want to check out Bluesky and Meta's Twitter rival.

Here is the original post:
Twitter explains why it sabotaged your feed without notice but no ... - TechRadar

Rekha Jhunjhunwala raises stake in this Tata group stock in June quarter – The Economic Times

Ace investor Rekha Jhunjhunwala has increased her stake in Tata Communications during the June quarter. According to the latest shareholding data available with the exchanges, Rekha has added 1.34 lakh shares in the Tata group company, taking the total holding to 1.84%. As of the March quarter, the shareholding stood at 1.79%. Tata Communications is a majority promoter-owned company, with promoters holding a 58.86% stake and public shareholders the remaining 41.14%. Among public shareholders, mutual funds own about 9.95% and foreign investors are sitting with 16.63%. A part of the Tata Group, Tata Communications enables the digital transformation of enterprises globally with collaboration and connected solutions, core and next-gen connectivity, cloud hosting and security solutions, and media services. As many as 300 of the Fortune 500 companies are its customers, and the company connects businesses to 80% of the world's cloud giants. On Friday, the company's shares closed 0.95% higher at Rs 1,551 on the NSE. The stock has gained 18% on a year-to-date basis.

As per Trendlyne data, the average target price of the stock is Rs 1,420, which implies a downside of 8% from the current levels.

Apart from Tata Communications, Rekha has cut her stake in Raghav Productivity Enhancers. The star investor had picked up the smallcap stock during the March quarter.

Raghav Productivity Enhancers is one of the largest ramming mass producers in the world. Ramming mass is used as furnace refractory lining material for steel companies.

Follow this link:
Rekha Jhunjhunwala raises stake in this Tata group stock in June quarter - The Economic Times

Box Achieves Health Data Hosting (HDS) Certification – StreetInsider.com

Box Content Cloud validated to protect and manage confidential and sensitive health data in France

PARIS--(BUSINESS WIRE)--Box, Inc. (NYSE: BOX), the leading Content Cloud, today announced that it has achieved the Hébergeur de Données de Santé, or Health Data Hosting (HDS), certification. With the Box HDS certification, companies that work with and in the French healthcare industry can now confidently secure, store and manage French protected health information (PHI) in the Box Content Cloud.

HDS was introduced as a framework to safeguard health data in France by the Agence du Numérique en Santé (ANS). Required by a 2018 revision to the French Public Health Code (Article L. 1111-8), HDS certification is mandatory for all companies hosting sensitive medical data. To achieve HDS certification, Box had to prove it met strict standards for storing and processing health data in France.

"Data security is paramount in the healthcare industry, where patient privacy and confidentiality are of critical importance," said Manu Vohra, Managing Director, Global Life Sciences at Box. "Earning the HDS certification demonstrates Box's continued commitment to maintaining the highest standards of data protection and privacy, and will allow healthcare and life science companies to leverage the benefits of cloud content management while maintaining compliance with French government standards."

Theranexus is a biopharmaceutical company developing drug candidates for the treatment of rare neurological diseases. "With more than 5,000 rare neurological diseases affecting nearly 350 million people worldwide, we see this as both a major societal challenge and a market in crucial need of therapeutic innovation," said Thierry Lambert, Chief Financial Officer at Theranexus. "Since 2020, we have utilised Box for secure content management. As a company based in France, it is great to see Box achieve the French HDS certification and remain committed to helping life science customers like Theranexus proactively navigate today's dynamic security and compliance landscape."

The Box HDS certification was achieved through a collaboration with Bureau Veritas, an independent third-party auditor accredited by French authorities to conduct HDS audits. Box is now listed on the ANS website as an HDS-certified host.

"At Box, our goal is to meet the high-water mark for security and compliance in every industry worldwide. To do so, we do the heavy lifting on behalf of our customers to make it easy for them to remain compliant," said Tom Cowles, Chief Compliance Officer at Box. "HDS is a further example of how we've built a robust data protection framework that allows us to deliver a secure content platform to help customers meet and exceed their regulatory and compliance obligations."

With this announcement, Box adds HDS Certification to a growing list of security compliance certifications, standards, and reports, including:

Box empowers many of the largest and most regulated enterprises around the world to accelerate business processes, power their workplace collaboration, and protect valuable information. The Box HDS certification is the company's latest announcement in the French market. In 2022, Box expanded the global network of its flagship Box Zones offering to include a France Zone, aimed at delivering flexibility for in-region storage at scale. For more information on Box's security and compliance offerings, visit the Box Trust Center.

About Box

Box (NYSE: BOX) is the leading Content Cloud, a single platform that empowers organizations to manage the entire content lifecycle, work securely from anywhere, and integrate across best-of-breed apps. Founded in 2005, Box simplifies work for leading global organizations, including AstraZeneca, JLL, Morgan Stanley, and Nationwide. Box is headquartered in Redwood City, CA, with offices across the United States, Europe, and Asia. Visit box.com to learn more. And visit box.org to learn more about how Box empowers nonprofits to fulfill their missions.

View source version on businesswire.com: https://www.businesswire.com/news/home/20230706363608/en/

Media Contact: Tiffany Kirkland

Source: Box, Inc.

See the article here:
Box Achieves Health Data Hosting (HDS) Certification - StreetInsider.com

Future-proof media, entertainment depends on making the right technology choices – ITWeb

Calvin Huang, Solutions Architect, Huawei Cloud South Africa.

Media and entertainment streaming services and content platforms are grappling with the multi-pronged challenge of managing massive volumes of content securely, while also adapting to changing consumer demands. Investing in the right technologies can help them overcome these challenges.

This is according to Calvin Huang, Senior Solution Architect at Huawei Cloud South Africa, who says media businesses must shift their focus from purely content to the technologies that underpin their business.

Huang says: "In recent years, the media sector has undergone significant changes, and has been influenced by various technologies, such as the cloud, big data, AI and machine learning. At the same time, people are spending more time on streaming services, OTT platforms and social media, and traditional TV viewership has declined. This shift has led to the creation of more original content by streaming platforms. Multimedia has become more interactive and immersive, with live streaming, VR, interactive and 360-degree experiences allowing audiences to actively participate in and explore the content."

Data volumes have exploded, he says: "Major platforms for sharing and transferring content are seeing massive increases in the amount of content being uploaded and shared. On YouTube alone, it is reported that over 500 hours of video content are uploaded every minute. Platforms like Netflix also deliver massive volumes of video and audio content. Social media is another major contributor to sharing media content: as of 2022, Facebook had nearly three billion monthly active users, while Instagram and TikTok each had around one billion monthly active users, sharing and engaging with photos, videos and live-streamed content."

But these well-known platforms are not the only source of multimedia content growth. Huang says: "Collaboration tools and cloud storage like Huawei WeLink, Google Drive and Microsoft OneDrive have made it easier to transfer and collaborate on multimedia files. As the technology evolves, the volumes of media data being shared and collaborated on will continue to increase."

He adds that the smartphone has become a primary device for multimedia content consumption. The wide availability of high-speed internet and the popularity of social media have driven growth in mobile-first, short-form content, which offers a lot of opportunity.

Another trend is that the abundance of content available has made personalisation and recommendations very important. Platforms must continually analyse user data to present relevant content and advertisements for a more engaging experience.

Huang says several new challenges confront media companies. "There are challenges around both the technology and user experience. Streamed media needs a stable connection and sufficient bandwidth to deliver seamless content, with no buffering. Content providers must focus on content delivery latency and optimising the user experience, especially when providing real-time interactive content such as livestreamed sports events. A streaming platform must also have scalable infrastructure capable of handling massive numbers of concurrent users without degradation in performance. In addition, there are security, compliance and content protection concerns."

To help overcome these challenges, media companies need advanced, secure, scalable and cost-effective media infrastructure solutions, Huang says.

"Media companies need a cost-effective media infrastructure solution to reduce capital expenditure, with multiple billing methods and dedicated teams to help clients design the billing model. They need scalability and elasticity in their cloud infrastructure to enable them to handle varying workloads and sudden spikes in traffic, and a global network of data centres and CDN edge nodes to ensure they can deliver content globally with low latency and high availability," he says.

"Importantly, media companies need technologies that support security and compliance. Media providers need to invest in cyber security and data protection, with robust DRM technology. They must also continue to invest in, and upgrade, their technology to deliver the best possible experience and meet changing needs."

Huawei Cloud is a top ICT and cloud solution provider, with various technologies and solutions to support content creation, management and distribution for media organisations, Huang says.

"On the networking and telecommunications side, we provide networking infrastructure and solutions for high-speed content transmission, with 5G, optical transmission systems, IP networks and wireless solutions. For media processing and delivery, we have a media processing research centre in Europe, with innovations such as HD low-bitrate technology to ensure that during delivery there is no compromise in quality, while saving on bandwidth, storage space and cost," he says.

Huawei innovations also include AI image recognition and content moderation technology to help streamers ensure legal content streaming, and the Digital Human intelligent virtual human for advertising and branding.

Huang concludes: "We also offer cloud computing and infrastructure-as-a-service through Huawei Cloud, with hosting for media applications, and platform-as-a-service to build content management and distribution tools."

Huawei Cloud Live, a one-stop solution built on years of video expertise, transmits live content at low latency and delivers smooth HD video even when there are massive concurrent requests. It has over 2 800 nodes worldwide, delivering over 100Tbit/s bandwidth, and 24/7 network-wide health management.

Link:
Future-proof media, entertainment depends on making the right technology choices - ITWeb

Twitter tries to fix its latest mistake as Meta readies its Threads launch – TechRadar

Twitter's most recent gaffes include setting daily rate limits on tweets and forcing users to be logged in if they want to view tweets at all. But it seems that with the launch of Meta's supposed Twitter-killer Threads, Twitter is trying to roll back at least one of those changes.

According to TechCrunch, not only are users able to view tweets without an account (or at least without being logged into one), tweet preview links are properly unfurling again in other apps like Slack, WhatsApp, and iMessage. According to former Twitter CEO Elon Musk, Twitter enacted these limits to combat "extreme levels of data scraping & system manipulation". He went on to explain that it was a "temporary emergency measure", due to getting "data pillaged so much that it was degrading service for normal users!"

Another reason for the enacted policies, though, may have been Google limiting Twitter's access to its cloud hosting services. It had been reported last month by Insider that the social media platform hadn't been paying its bills.

It was also speculated by web developer and Twitter user Sheldon Chang that the site was essentially DDoSing itself. These attacks occur when someone intentionally overwhelms a site by sending too many requests, keeping normal users from accessing it. According to Chang, the company constantly made data requests despite the site not loading properly, which directly resulted from the login requirement.

Meanwhile, we have the imminent launch of the Instagram app Threads on July 6, 2023, which is described by Meta as a "stand-alone decentralized social network for sharing text updates". The tech giant believes that "there's an opportunity for a separate space where creators and public figures can share timely updates about their interests."

It has all the basic hallmarks of Twitter (like, comment, repost, and share buttons at the bottom of each post) and lets you avoid immediate news feed clutter by letting you transfer your current Instagram follower list. It also lets you reserve your Instagram username if you already have an account. Judging from the recently leaked images, it's made to resemble Twitter as much as possible.

It seems that Meta is gearing up to compete with Twitter while capturing the growing number of users disgruntled by Elon Musk's constant policy changes and questionable decisions. We'll soon see if Meta is capable of what other Twitter lookalikes haven't been able to accomplish yet: actually replacing Twitter as a preferred social media platform.

Read more here:
Twitter tries to fix its latest mistake as Meta readies its Threads launch - TechRadar

DataBank Celebrates the Opening of the New Jersey Institute of … – PR Newswire

DataBank is a leader in supporting High Performance Computing (HPC) environments, providing the power density and cooling that college research institutions require.

DALLAS, May 25, 2023 /PRNewswire/ -- DataBank, a leading provider of enterprise-class colocation, connectivity, and managed services, congratulates the New Jersey Institute of Technology (NJIT) on the opening of its Wulver High Performance Computing environment, now live in DataBank's Piscataway, NJ EWR2 data center.

"Research has propelled NJIT's growth as a leading national university, and partnering with DataBank will dramatically improve the performance of our research computing environment," said Atam Dhawan, interim provost and senior executive vice president. "This partnership supports NJIT's research mission to advance the knowledge base through new discoveries, and basic, applied and translational research and technology development."

NJIT's steady rise in research activity and output has necessitated the upgrade. NJIT is one of only 146 universities nationwide to earn the R1 ranking by Carnegie Classification (the highest research designation) and one of just three R1 universities in New Jersey. Research expenditures have surpassed $160 million, up nearly $70 million since 2010. NJIT comprises 150 research institutes, centers, and laboratories, up from 31 in 2010.

The university aims for national and international prominence in research through new discoveries in areas ranging from medical sensors and devices to robotics, to nanotechnology, to cybersecurity, to next-generation materials, among other topics of vital importance in basic, applied, and translational research. Five research clusters exist within the research enterprise of NJIT: Bioscience and Bioengineering; Data Science and Management; Environment and Sustainability; Materials Science and Engineering; and Robotics and Machine Learning.

DataBank is also announcing the creation of a new higher education program and partnership with NJIT where DataBank will offer internships, co-op, and scholarships to students at NJIT and other higher education institutions. "Tailoring this HPC hosting environment was a complicated effort that required close collaboration between key stakeholders from NJIT and DataBank," said Paul Attallah, national account manager at DataBank. "This teamwork inspired us to develop this mutually beneficial program. DataBank will benefit from the contributions of NJIT's students, and NJIT will give its students a chance to advance their careers."

DataBank delivers High Performance Computing environments to NJIT and other higher education institutions in support of their research programs, as well as to other organizations with requirements for high-powered compute and storage.

DataBank and NJIT will celebrate the opening of the HPC environment at the DataBank campus on May 31 and will host a live symposium later in 2023.

For more information about available services at DataBank's colocation facilities or to request a quote, visit databank.com or call 800-840-7533.

About DataBank

DataBank enables the world's largest enterprises, technology, and content providers to consistently deploy and manage their infrastructure, applications, and data on the right platform, at the right time, in the right place. Our colocation and edge infrastructure footprint consists of 60+ data centers and 20 interconnection hubs in 30+ markets, on-ramps to an ecosystem of cloud providers, and a modular edge data center platform with virtually unlimited reach.

We combine these platforms with contract portability, managed security, compliance enablement, hands-on support, and a guarantee of 100% uptime availability, to give our customers absolute confidence in their IT infrastructure and the power to create a limitless digital future for their business.

To learn more or tour a facility, visit databank.com or call 1 (800) 840-7533.

SOURCE DataBank

See the original post here:
DataBank Celebrates the Opening of the New Jersey Institute of ... - PR Newswire

The Importance of Cloud Computing for Your Business: Benefits and Adoption Strategies – BBN Times

Businesses of all sizes are turning to cloud computing in today's fast-paced digital environment to streamline operations and boost bottom lines.

Cloud computing offers numerous benefits that can help businesses meet their goals more quickly and efficiently. In this article, we'll explore why cloud computing matters to your organisation and how it could assist your company.

Before exploring the importance of cloud computing for your business, it's essential that you fully comprehend its nature. Cloud computing essentially refers to delivering computing services over the internet - such as storage, software and processing power, as well as analytics and machine learning services. Effectively, instead of hosting applications and data on servers you own or run in-house, they sit on servers located elsewhere, managed by a provider who makes them accessible via the internet.

One of the critical advantages of cloud computing for businesses is cost savings. No longer must they invest in expensive hardware and pay for its maintenance or upkeep - instead, they pay only for the services they use, with flexible scaling based on demand. This allows them to reduce infrastructure costs while also decreasing IT staff requirements.

Cloud computing offers greater scalability and flexibility than traditional IT infrastructure, allowing you to rapidly adjust resources up or down according to business demands without investing in extra hardware or software. As a result, your business can respond quickly to changing customer demand.

Cloud computing also facilitates teamwork among team members regardless of physical location, providing access to vital information that enables project collaboration in real-time - helping improve productivity while decreasing the time and costs associated with traditional communication methods like email or phone.

Cloud computing providers invest heavily in security and reliability, which means your data and applications are often safer and more reliable than they would be on your own servers. In addition, providers typically offer multiple layers of protection, such as firewalls, encryption and 24/7 monitoring support, that help safeguard against cyber threats while ensuring your data remains accessible when you need it most.

Now that you understand the significance of cloud computing for your business, you may be asking how your enterprise can adopt it. Here are a few steps you should follow in this endeavour:

The initial step to adopting cloud computing is assessing your business needs. Consider your goals, the challenges you are experiencing and which tools and applications you currently rely upon - this will allow you to identify where cloud computing may assist.

Just like with any new technology, training your staff on how to use cloud-based tools and applications is crucial to taking full advantage of all their benefits while mitigating the risk of user error or security breaches.

Cloud computing has quickly become an indispensable business asset, offering cost savings, flexibility, scalability, increased collaboration and productivity, improved security and reliability, as well as agility and innovation for businesses of all sizes. Adopting cloud computing can help businesses meet their goals more quickly while decreasing IT staff and infrastructure requirements - by following these steps, they can migrate successfully into the cloud and start reaping its many advantages.

Here is the original post:
The Importance of Cloud Computing for Your Business: Benefits and Adoption Strategies - BBN Times

Size of the Prize: Assessing the Market for Edge Computing in Spaces – Via Satellite

Via Satellite illustration.

This is the second of a two-part series analyzing the value of edge computing in space by the Boston Consulting Group. Read Part One: Size of the Prize: How Will Edge Computing in Space Drive Value Creation?

What key drivers are necessary to ensure that edge computing in space is widely adopted to the degree that it reaches an inflection point of affordability? We at the Boston Consulting Group believe that cybersecurity, cost, and ESG will drive the market for edge computing in space.

Cybersecurity is an area in which edge computing offers distinct advantages. Cloud computing is vulnerable to the ever-increasing risk of cybersecurity breaches, which can lead to major data theft or loss. Organizations across industries that collect personally identifiable information on a public cloud expose themselves to liability and/or compliance concerns, while sensitive intellectual property and proprietary industry data can become vulnerable to cybersecurity attacks at various nodes of transmission, particularly given growing dependency on cloud computing.

The main challenge presented by the current cloud computing landscape is that corporate services and data are entrusted to third parties and are exposed to a higher level of risk, both in terms of security and privacy. The top three threats to cloud systems are unsafe API interfaces, data loss or theft, and hardware failure. The widespread use of virtualization in the implementation of cloud infrastructure also creates security problems because it alters the relationship between operating systems and underlying hardware, introducing an additional level that must be managed and protected.

In contrast, edge computing introduces multiple advantages for cybersecurity since data is processed locally. This eliminates risks stemming from data transfers, which, while typically encrypted, are inevitable when using typical terrestrial cloud solutions. With edge computing, complex calculations occur at the IoT device/perimeter server level, and the only transfer is that of the final result to the user. The risk of data loss is driven more by damage to local servers than by cybersecurity vulnerabilities.

Cost also presents an area of advantage to edge computing. Organizations could achieve operational cost savings by using edge computing due to the minimal need to move data to the cloud. Since data is processed at the same location where it is generated (in this case, on the satellites themselves, collecting imagery through hyperspectral or SAR capability or remote sensing data), processing these batches of data on the same satellite would also yield a significant reduction in the bandwidth needed to handle the data load.

Hosting applications and data on centralized hosting platforms or centers creates latency when users try to use them over the internet. Large physical distances coupled with network congestion or outages can delay data movement across the network. This then delays any analytics and decision-making processes.

Edge computing in space, in this context, could enable data to be accessed and processed with little or no obstacles, even when there is poor internet connectivity. Importantly, if there is failure with one edge device, it will not destroy the operation of the other edge devices in the ecosystem, facilitating a reliable, connected system.

Finally, there are potential gains to be achieved in terms of ESG metrics by adopting in-space edge computing capability. With the cloud business model dominating, there are emerging concerns about the environmental effects of centralized processing. Processing centers require enormous resources to function; they contribute to carbon emissions, accounting for 0.6% of all greenhouse gas emissions, and produce electronic waste, adding to the burden humans put on the environment in pursuit of advancement.

Edge computing has become a potential alternative to moving data centers to greener practices. The edge helps reduce the networking traffic coming in and out of centralized servers, reducing bandwidth and energy drains. This frees up bandwidth at the data center itself and bandwidth for the organization, overall, in terms of any centralized servers on-premises. Moving edge computing to space would achieve even further reductions in energy consumption required at the terrestrial data center level, while the needs for temperature control and cooling would be eliminated by the freezing temperatures in LEO.

In order to estimate an overall market for edge computing in space and explain why in-space edge computing capability and associated user interface applications need to be built, we triangulated three approaches to the market: Supply, Demand, and Cost.

Today, roughly 20% of data processing and analysis occurs locally, with 80% happening in centralized data centers and computing facilities.

We developed a high, low, and base case for estimating the share of industry addressable by space solutions, and as a core assumption of the model, we used reliance on cybersecurity to gauge what share of each industry would be addressable from space. With this model, we expect an estimated $250 million market by 2030, with defense and satcom as the leading industries for application. However, it is important to note that the estimated $250 million market is addressed by only one segment of the total scope laid out in the Edge Computing in Space Capability Stack (Figure 2).

Figure 2: The capability stack for edge computing in space demonstrates the breadth of functions which could be enabled and supported for different end users. Source: BCG analysis.

Further upside would emerge as addressable market opportunity for connectivity service providers (satcom/telecom); application developers (who would be responsible, for example, for developing the apps a specific government customer uses to interpret processed information); terminal/user interface manufacturers; and the residual flow down to data centers for cloud computing purposes. Other segments of the Edge Computing in Space Capability Stack would see further value unlocked as Edge in Space comes online, delivers key capabilities to the highest-need customer groups (e.g., those in defense), and brings the cost curve down for commercial use cases and applications to emerge.

By estimating demand for cloud computing across target industries and supply of satellite revenue in the aggregate space market, and by comparing the cost of terrestrial and space data storage centers, we believe that there is more demand for cloud computing in space than there is supply from satcom providers.

Our model indicates that the cost to host data in space will closely approach terrestrial data costs past 2030, while on supply and demand, we anticipate more demand for cloud computing in space than supply from satcom providers.

In light of these differentiating factors and our model research, demand for edge computing is established and expected to grow (Figure 3). We project all of Satellite IoT spending, $1.5 billion by 2030, to be addressable given the importance of cybersecurity. We estimate the relevant edge computing market (excluding hardware and non-core service software) to be $0.3 billion by 2030, of which 75% would be in-scope. Finally, we estimate up to 2% of the total $1.2 billion cloud compute market by 2030 to be in-scope due to the selective applicability of cybersecurity and latency needs for real-time analysis.
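Restated as back-of-the-envelope arithmetic, those in-scope estimates work out as follows (a sketch only: the component percentages come from the figures above, while summing the three pools into a single total is our own illustrative assumption, not a figure from the model):

```ts
// Back-of-the-envelope restatement of the 2030 in-scope estimates quoted above.
// All values in billions of USD; summing the pools is an illustrative assumption.
const satelliteIoT = 1.5;             // all Satellite IoT spending, assumed addressable
const edgeInScope = 0.75 * 0.3;       // 75% of the ~$0.3B relevant edge computing market
const cloudInScope = 0.02 * 1.2;      // up to 2% of the ~$1.2B cloud compute market

console.log(edgeInScope.toFixed(3));  // "0.225"
console.log(cloudInScope.toFixed(3)); // "0.024"
console.log((satelliteIoT + edgeInScope + cloudInScope).toFixed(2)); // "1.75"
```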

However, research indicates that supply is currently lagging behind expected need due to insufficient public and private investment, with key implications for government and private investors.

The key drivers to understanding which companies will unlock the potential of edge computing in space include prioritizing cybersecurity, lowering cost burden, and adopting ESG practices. With increasing digitalization, the space economy will further benefit from integrating edge computing into space-based business models. However, companies and governments must help develop the needed supply that our current space investment demands.

While cloud computing will remain an integral part of the overall market for the foreseeable future, the advantages offered by edge computing in space are clear enough that actors in the most promising markets of defense and agriculture should be considering the questions posed earlier. For government: how can it leverage this technology to enhance the security of critical assets and information? How should government invest in developing the market for space-based edge computing, and how can it effectively support its growth? What role will incentives play, and will they be tied to ESG targets?

For industry, there are questions around how to sell to target customers in key markets such as government and agriculture. Are the start-up and non-recurring engineering costs prohibitive, and what investments and partnerships will be required? What scenarios exist for the development of requisite ground infrastructure?

Go-to-market success will require integrating the edge-computing-in-space-as-a-service capability into a suite of other services that could already be on offer. In addition, as commercial space stations look to develop edge computing in space offerings, successful methods will integrate this capability among others in orbit, such as where and how remote sensors collect the data, where and how the data analytics are performed, and potentially offering various data streams to the same group(s) of customers utilizing the same sensors to optimize the quality and quantity of output.

The space industry is no stranger to partnering closely with suppliers and customers, including governments, to develop and deliver new technology and advance the art of the possible. By making the right investments, governments, investors, and users in edge computing can turn democratizing space from an expression into a reality.

This paper is the second of a two-part series analyzing the value of edge computing in space by the Boston Consulting Group. Read Part One: Size of the Prize: How Will Edge Computing in Space Drive Value Creation?

S. Sita Sonty leads Boston Consulting Group's Commercial Space team. John Wenstrup is a senior leader in BCG's Technology, Media & Telecommunications practice. Cameron Scott is Global Sector Lead for Defense and Security. And Dr. Hillary Child is a Project Leader from BCG's Chicago office.

Additional research by Avril Prakash, Sarvani Yellayi, Ansh Prasad, and John Kim

Follow this link:
Size of the Prize: Assessing the Market for Edge Computing in Spaces - Via Satellite

Cloudflare Is Fixing the Biggest Problem with Its AWS Killer – The Motley Fool

Amazon Web Services is a hulking mess of a cloud computing platform. It offers hundreds of distinct products and services, many of which overlap, with pricing schemes that almost seem designed to confuse. Enterprises love AWS, but then again, they also have armies of IT staff to figure it all out.

There's a big market for cloud computing simplicity. Platforms like DigitalOcean, Akamai's Linode, Vultr, Netlify, Vercel, and many others aim to make life as simple as possible for developers. In some cases, that means a highly curated list of services with dead-simple pricing. In others, it means an opinionated serverless platform that makes deploying applications a breeze.

Cloudflare (NET -0.91%) is taking the latter route with its serverless Workers platform. With Workers, developers can deploy full-stack applications to Cloudflare's global fleet of edge servers. Code runs nearly instantly as close to the end user as possible.

Cloudflare has been building an ecosystem around Workers for the past few years, rolling out products including R2 object storage, Pub/Sub for messaging, and the D1 relational database. But there's been one glaring problem that has made the whole platform a non-starter for many potential customers, and Cloudflare finally has the pieces in place to fix it.

Any useful web application needs some sort of database. There are countless database software options to choose from. In the world of relational databases, there's Oracle, Microsoft's SQL Server, and open-source options like MySQL and PostgreSQL. Outside of relational databases, there's the document-based MongoDB, key-value store Redis, and a slew of others. Some applications may use multiple databases, while others may stick to one. But somewhere, data needs to be persisted in an orderly, queryable way.

The one thing that almost all databases have in common is that they don't work over HTTP. In other words, hitting the database isn't just a simple API call. Databases generally use lower-level, long-lived TCP connections with custom protocols.

Up until now, Cloudflare's Workers did not support TCP connections. That meant a developer running a database somewhere else could not use Workers with that database without resorting to a middleman. A developer could set up a server application outside of Cloudflare that accepts API calls from a Worker, pulls data from the database, and returns it to the Worker, but the Worker could not access the database directly.

That's an annoying problem. Annoying enough, in all likelihood, to make Cloudflare Workers a non-viable option for many developers. The good news is that Cloudflare finally has the beginnings of a solution to this problem. On Tuesday, the company announced a new feature for Workers that allows outbound TCP connections.

Cloudflare Workers can now connect to any service that accepts TCP connections. There are some caveats, though. For connecting to databases, the database driver library a developer uses to initiate and manage the connection must explicitly support Cloudflare's solution. Cloudflare is working on broadening support, but there are currently big gaps.
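To make the change concrete, here is a minimal sketch of what an outbound TCP connection from a Worker looks like with the sockets API described in Cloudflare's announcement. The hostname, port, and payload are placeholders; in practice, a database driver library that supports the API would speak the wire protocol rather than raw reads and writes.

```ts
// Minimal sketch of the Workers outbound TCP sockets API.
// db.example.com, port 5432, and the "ping" payload are placeholders;
// a real database connection would be negotiated by a driver library.
import { connect } from 'cloudflare:sockets';

export default {
  async fetch(request: Request): Promise<Response> {
    try {
      // Open an outbound TCP connection from the edge to an external service.
      const socket = connect({ hostname: 'db.example.com', port: 5432 });

      // Write raw bytes over the socket's WritableStream...
      const writer = socket.writable.getWriter();
      await writer.write(new TextEncoder().encode('ping\r\n'));
      writer.releaseLock();

      // ...and stream whatever comes back as the HTTP response body.
      return new Response(socket.readable, {
        headers: { 'Content-Type': 'text/plain' },
      });
    } catch (error) {
      return new Response(`Socket connection failed: ${error}`, { status: 500 });
    }
  },
};
```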

Database access was the missing piece of the puzzle for Cloudflare Workers. Once a Worker can connect directly to any database, the potential of the platform greatly expands. Suddenly, a developer hosting a database on AWS, or on any other cloud platform, can run their actual application on Cloudflare Workers, benefiting from the global reach of Cloudflare's network.

Serverless platforms like Cloudflare Workers take cloud computing and distill it down into deploying applications and not worrying about issues like scaling and performance. The platform takes care of all that. Compare this to the nightmare of managing complex cloud infrastructures on AWS, and you can see why serverless platforms are gaining in popularity.

With database access on its way to being a solved problem, Cloudflare Workers takes another step toward largely eliminating the need for a traditional cloud platform for developers and businesses.

John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Timothy Green has positions in DigitalOcean. The Motley Fool has positions in and recommends Amazon.com, Cloudflare, DigitalOcean, Microsoft, and MongoDB. The Motley Fool has a disclosure policy.

The rest is here:
Cloudflare Is Fixing the Biggest Problem with Its AWS Killer - The Motley Fool