Category Archives: Cloud Servers

10 Best WordPress Hosting Companies (March 2023) – Unite.AI

In the digital era, having a reliable web host is essential for businesses and individuals alike. The right web hosting provider can make or break your online presence, particularly when it comes to WordPress websites. WordPress isn't always coded efficiently for speed, and a fast web host can shave fractions of a second off loading times, a core component of SEO, resulting in higher search engine rankings.

In this article, we will explore the top 10 WordPress web hosting providers, explaining why they stand out from the competition and how they can benefit your website.

We believe in what we preach, which is why we host with our top recommendation, SiteGround. Established in 2004, SiteGround has long been a favorite among WordPress users due to its high-quality hosting services and excellent customer support. SiteGround is officially recommended by WordPress.org, which speaks to its reliability and performance. Key features of SiteGround include:

One of the most important aspects of ranking high in Google is page loading speed. Below is our SiteGround-hosted WordPress website that receives nearly 2 million monthly visitors, and the metrics it receives in Google PageSpeed Insights.

The above reasons, combined with ongoing positive experiences with customer support, are why we recommend SiteGround.

Bluehost is another hosting provider officially recommended by WordPress.org. With affordable pricing plans and robust features, Bluehost is a popular choice among both beginners and experienced users. Key features of Bluehost include:

ScalaHosting is a versatile hosting provider that offers managed WordPress hosting with an emphasis on speed and security. Key features of ScalaHosting include:

Skystra specializes in WordPress hosting and provides a user-friendly platform with a focus on performance and security. Key features of Skystra include:

A2 Hosting is a popular hosting provider that offers WordPress-specific hosting plans. With a focus on speed and reliability, A2 Hosting is an excellent choice for WordPress websites. Key features of A2 Hosting include:

Kinsta is a premium managed WordPress hosting provider that focuses on high-performance and expert support. Powered by Google Cloud Platform, Kinsta offers a fast and secure hosting solution for WordPress websites of all sizes. Key features of Kinsta include:

Hostinger is an affordable hosting provider that offers WordPress-specific hosting plans with a focus on speed and ease of use. Key features of Hostinger include:

Stablehost is a reliable hosting provider that offers shared and managed WordPress hosting plans. With a focus on customer satisfaction, Stablehost provides a solid hosting solution for WordPress users. Key features of Stablehost include:

Hostens is an affordable hosting provider that offers shared hosting plans optimized for WordPress websites. Key features of Hostens include:

DreamHost is a well-established hosting provider that offers a variety of hosting plans, including managed WordPress hosting. With a focus on performance and reliability, DreamHost is an excellent choice for WordPress websites. Key features of DreamHost include:

Each of these top 10 WordPress web hosting providers offers unique features, performance, and pricing to cater to different needs. When choosing a hosting provider for your WordPress website, consider factors such as your budget, technical requirements, and desired level of support. By carefully evaluating your options, you can select the best hosting provider to ensure a successful and secure online presence for your WordPress website.

Continued here:
10 Best WordPress Hosting Companies (March 2023) - Unite.AI

Jabra releases Elite 4 everyday earbuds with ANC for long-lasting … – iTWire

Audio, video, and collaboration solution provider Jabra has released its latest feature-packed earbuds, the Jabra Elite 4, bringing all-day ANC, multipoint Bluetooth, dust- and water-resistance, and many other features, all in a great package at a great price.

iTWire has long enjoyed seeing Jabra's extensive and excellent range of audio equipment including conference room setups, as well as headphones and earbuds, and the company is continuing to deliver on its proven 150-year history of audio engineering with this latest release, the Jabra Elite 4.

This tiny set of earbuds is the latest addition to Jabra's Elite line-up and is a step up from the previous Elite 3 model. It brings an incredible 22-hour battery life using the charge case - and that's with ANC turned on! It's 28 hours with ANC off. And, as you might have guessed by that statement, it includes active noise cancellation, or ANC, which takes you out of the hustle and bustle of everyday life and into your own personal cocoon of peace as you immerse yourself in music or podcasts. Or talk to your friends or colleagues. Whether you're using these earbuds for work or play, Jabra has your back.

They also offer Bluetooth Multipoint so you can be connected to two different devices simultaneously. Listen to music on your laptop while connected to your phone for calls. Or the other way around: play music through your phone while connected to your laptop for Teams or Zoom calls. Or your tablet and phone, or whatever combination you want. iTWire has been trying the Elite 4 out for ourselves and found that not only are the earbuds comfortable and easy to fit, but the multipoint also worked effortlessly. The initial pairing was a cinch, with Jabra imbuing the Elite 4 with extra smarts in the form of Fast Pair and Swift Pair, which connect and link your earbuds instantly to your Android 6.0 or higher phone or tablet, or Windows 10/11 laptop or desktop. It was quite magical how quickly that worked. iOS, macOS, and Linux users aren't left out in the cold though; regular Bluetooth pairing still works exactly as normal.

The ANC is impressive for such a tiny device. Jabra makes it work with feedforward ANC that filters out unwanted sounds. The buds include four microphones and 6mm speakers to ensure you're also heard loud and clear when making calls, with crystal clear sound. The companion Jabra Sound+ app allows you to further personalise sound qualities to your own liking.

The battery gives 5.5 hours of playtime by itself but extends to 22 hours using the charging case. As before, that's with ANC on and extends to 28 hours with ANC off. You can fast charge to get an hour of playback with 10 minutes of charge time. What's more, you can also go solo. The earbuds don't have to work as a pair, and in fact, allow you to use one while the other charges. Thus, you can literally keep playing and keep playing and keep playing with zero downtime.

The earbuds are also rated IP55 against dust and water; "IP" stands for ingress protection, and the first digit rates protection against dust (on a scale of 0 to 6) while the second rates protection against water (on a scale of 0 to 9). At 5 for each, the earbuds offer serious protection for pretty much any scenario you use them in - except deep sea diving, underground coal mining, or the most seriously dusty and wet conditions. In short, you can drop them on your floor, use them in the gym, or get caught in the rain, and your Jabra Elite 4 earbuds will effortlessly shake it off.
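The IP code itself is easy to decode programmatically. Here's a minimal, illustrative Java sketch (our own, not from any Jabra software) that splits a rating like IP55 into its two digits:

```java
// Illustrative sketch: decoding an IEC 60529 ingress-protection (IP) code.
public class IpRating {
    public static void main(String[] args) {
        String code = "IP55";
        int dust = Character.getNumericValue(code.charAt(2));  // first digit: solids, scale 0-6
        int water = Character.getNumericValue(code.charAt(3)); // second digit: liquids, scale 0-9
        System.out.printf("Dust protection %d/6, water protection %d/9%n", dust, water);
    }
}
```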

They come in Dark Gray, Navy, Lilac, and Light Beige, and are designed with premium durable materials. All in all, this is an impressive package for everyday true wireless earbuds with ANC, and iTWire found them comfortable and effortless to use, delivering crisp, clear audio with pumping sound when we wanted to enjoy our music.

Jabra SVP Calum MacDougall said, "The modern earbud user is looking for tech that's ready for work and play at their fingertips, whilst not compromising on key features. The Elite 4 offers a solution to this and is the perfect all-rounder, designed to help users to concentrate, connect, and call without distractions, and is the ideal companion to balance work and life."

The Jabra Elite 4 is available at selected retailers at a retail price of $139 (NZ$159) and includes a two-year warranty.

And, hey, if you're a Pokémon player, you know you need earbuds called the Elite 4, while you smash the other Elite 4.

Read more:
Jabra releases Elite 4 everyday earbuds with ANC for long-lasting ... - iTWire

Exxact Joins Supermicro’s Test Drive Program to Enable Remote … – HPCwire

FREMONT, Calif., March 21, 2023 - Exxact Corporation, a leading provider of high-performance computing (HPC), artificial intelligence (AI), and data center solutions, announced its participation in Supermicro's GPU Test Drive program to trial accelerated development on NVIDIA H100 GPU-powered workloads. Exxact's customers can remotely test drive the capabilities of the newest flagship data center GPU, the NVIDIA H100 80GB.

Potential customers can apply for the program through Exxact's Test Drive website. Once approved, customers can obtain remote access to a high-performance solution platform to test, benchmark, and qualify their advanced workloads with dual NVIDIA H100 80GB PCIe Tensor Core GPUs.

"Exxact Corporation is collaborating with Supermicro to provide remote access to a powerful system, providing an excellent opportunity to prove its capabilities in accelerating workloads across a wide variety of applications with NVIDIA GPUs," said Jason Chen, Vice President, Exxact Corporation.

More about the Supermicro H100 Test Drive Platform

Experience the unprecedented boost in performance delivered by the Supermicro next generation 4U server with the new NVIDIA H100 Tensor Core GPUs through the Test Drive program. The new Supermicro system delivers exponential performance gains over the current generation of systems and is optimized for large language models, HPC, and various AI training workloads.

About Exxact Corporation

Exxact develops and manufactures high-performance computing platforms and solutions that include workstation, server, cluster, and storage products developed for Deep Learning, Life Sciences, HPC, Big Data, Cloud, and more. With a full range of engineering and logistics services, including consultancy, initial solution validation, manufacturing, implementation, and support, Exxact enables its customers to solve complex computing challenges, meet product development deadlines, maintain a competitive edge, and fuel the innovative minds of the world.

Source: Exxact

See the original post here:
Exxact Joins Supermicro's Test Drive Program to Enable Remote ... - HPCwire

White House to Regulate Cloud Security: Good Luck With That – Security Boulevard

Biden administration wants new regulations for cloud providers. But we're not sure it'll help.

Old people in suits propose new bureaucracy in an attempt to make IaaS, PaaS and SaaS more secure. Amid much tut-tutting about SolarWinds, they seem convinced they can make a difference.

The internet disagrees. In today's SB Blogwatch, we unpick the arguments.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Uptown Car.

What's the craic? John Sakellariadis reports "Biden administration is embarking on the nation's first comprehensive plan to regulate the security practices of cloud providers":

Cloud providers haven't done enough: Governments and businesses have spent two decades rushing to the cloud trusting some of their most sensitive data to tech giants that promised near-limitless storage, powerful software and the knowhow to keep it safe. Now the White House worries that the cloud is becoming a huge security vulnerability. If the government fails to find a way to ensure the resilience of the cloud, it fears the fallout could be devastating.

For all their security expertise, the cloud giants offer concentrated targets that hackers could use to compromise or disable a wide range of victims all at once. And cloud servers haven't proved to be as secure as government officials had hoped. Hackers from nations such as Russia have used cloud servers from companies like Amazon and Microsoft as a springboard to launch attacks. Cybercriminal groups also regularly rent infrastructure from U.S. cloud providers.

Cloud providers haven't done enough to prevent criminal and nation-state hackers from abusing their services, officials argued, pointing in particular to the 2020 SolarWinds espionage campaign. [And they] express significant frustration that cloud providers often up-charge customers to add security protections: Agencies that fell victim to the Russian hacking campaign had not paid extra for Microsoft's enhanced data-logging features.

Maybe more from Matt Milano? "Biden Administration Prepares to Regulate Cloud Security":

Cloud security lapses: There's hardly any aspect of daily life that isn't touched by the cloud in some way. That ubiquity is a source of concern. [So] the Biden Administration now views the cloud industry as too big to fail.

Unfortunately, while companies have raced to deploy cloud platforms and services, cloud security has often lagged behind, leaving organizations and individuals vulnerable. Even worse, critical infrastructure has come under attack as a result of cloud security lapses.

Will it work? Stephen E. Arnold observes thuswise, "Big Tech, Lobbyists, and the US Government":

Armies of attorneys

Here's what stood out to rdevsrex:

The Biden administration will require cloud providers to verify the identity of their users to prevent foreign hackers from renting space on U.S. cloud servers.

Wait. Pause. Joe'll do what now? Here's a slightly sarcastic u/ryosen:

Oh good. A bunch of septuagenarians that have demonstrated, time and again, that they lack even the most fundamental understanding of how technology works, are going to legislate how technology should work. I'm sure this will be just fine.

And this Anonymous Coward is nonplussed:

Ignoring the "hackers" scare wording, actual foreign spies have no problem getting US identity cards. So this is zero protection. I don't buy for a moment that the POTUS, with the best advisors US government dollars can buy, doesn't know this. So it's for another reason. And that reason is the same as why China demands every citizen register for online services with their government identity: To keep tabs on political adversaries.

This is fine. u/sometimesanengineer sips coffee amid the conflagration:

The US government doesn't understand cloud enough to properly regulate it. I've seen enough stuff get past a C3PAO to anticipate a meaningless designation getting applied that customers think absolves them of their piece of the Shared Responsibility Model. Same as we've seen with Azure Government or AWS GovCloud.

Information has a tendency to be left off architecture and design documentation. Policies / procedures / practices claimed in controls compliance are not necessarily followed. Layers of the system or components of the system are often left out. And changes are made for expediency's sake, often to fix something else that's broken, which in complex systems is a quick way to screw things up.

Lawmakers gonna lawmake. techno-vampire predicts pointlessness:

Let me guess: At least 75% of any new regulations will require cloud providers either to do things or to stop doing things that are already covered by existing regulations. And most of the remaining 25% will either be useless, or so ambiguous that nobody will be able to tell if any company is following them or not. That's because the only point of creating these new regulations will be so that the Administration can claim that they did something.

Meanwhile, u/fractalfocuser laughs and laughs and laughs:

Ohhhh lord this is too funny. Quick everybody! Put the cat back in the bag!

Funk Wash!

Previously in And Finally

You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites so you don't have to. Hate mail may be directed to @RiCHi or sbbw@richi.uk. Ask your doctor before reading. Your mileage may vary. Past performance is no guarantee of future results. Do not stare into laser with remaining eye. E&OE. 30.

Image sauce: DonkeyHotey (cc:by-sa; leveled and cropped)

Read the original:
White House to Regulate Cloud Security: Good Luck With That - Security Boulevard

BT picks AWS Wavelength for 5G and the cloud – Capacity Media

The partnership combines AWS's cloud with BT's 5G and 4G infrastructure. Specifically, pairing EE's national mobile network with AWS Wavelength will bring AWS closer to the network edge, delivering faster, more secure, high-bandwidth connectivity for use cases like policing, crowd management, healthcare and security.

"As we continue to build best-in-class 5G infrastructure for the UK, launching the AWS Wavelength service for our business and wholesale customers is a hugely important step on our journey, bringing the power of the cloud to the UK's best network. It's set to unlock use cases like IoT cameras to help first responders keep communities safe: a real-life example of using tech to connect for good," said Alex Tempest, managing director for BT Wholesale.

"By building cloud edge services into our 5G and 4G EE network, we can accelerate innovation across industries, and bring fast, secure data processing closer to where our customers need it most. Ultimately, we want to give businesses and public sector organisations all the power of edge computing, wherever they are."

The news forms part of BT's investment in its existing mobile networks, to enable 5G-connected infrastructure as a service via AWS Wavelength.

This includes switching on a new AWS Wavelength Zone in Manchester, which will service trials for eligible businesses and public sector organisations within a 100km radius, with plans to roll out AWS Wavelength to more business customers across the UK in the near future.

AWS Wavelength embeds AWS compute and storage services within 5G and 4G networks, providing mobile edge computing infrastructure for ultra-low-latency applications. Hosting services directly at the edge of EE's UK network reduces lag, as application traffic can reach application servers running in the AWS Wavelength Zone without leaving BT's network.
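That "stays in the network" claim is also easy to sanity-check from a client. The sketch below is purely illustrative (both hostnames are hypothetical placeholders, and nothing here comes from BT or AWS): it times a round trip to an application server in a Wavelength Zone against one in a parent AWS Region, using the standard JDK HTTP client.

```java
// Illustrative latency probe: compare round-trip times to an edge (Wavelength)
// endpoint and a regional endpoint. Hostnames are hypothetical placeholders.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EdgeLatencyProbe {
    static long timeRequestMs(HttpClient client, String url) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(URI.create(url)).GET().build();
        long start = System.nanoTime();
        client.send(req, HttpResponse.BodyHandlers.discarding());
        return (System.nanoTime() - start) / 1_000_000; // elapsed milliseconds
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        System.out.println("Edge RTT:   " + timeRequestMs(client, "https://edge.example.com/ping") + " ms");
        System.out.println("Region RTT: " + timeRequestMs(client, "https://region.example.com/ping") + " ms");
    }
}
```

On a 5G handset attached to EE's network, the first number should come in consistently lower, since that traffic terminates at the edge rather than traversing the wider internet.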

Read this article:
BT picks AWS Wavelength for 5G and the cloud - Capacity Media

Devoli bucks cloud migration trend with hybrid on-prem move to HPE – Reseller News

Craig Murphy (Hewlett Packard Enterprise)

Network automation vendor Devoli has bucked the trend of cloud migrations, instead opting for a move to a hybrid on-prem solution powered by Hewlett Packard Enterprise (HPE).

Three years ago, Devoli had three core platforms that supported all voice applications and network management systems. With Dell servers and many of its monitoring and backend systems sitting with AWS, the systems and equipment were nearing end of life.

"We run telco applications that require very high bandwidth and found that AWS couldn't keep up with the necessary processing power," said Ken Nicod, networking director at Devoli.

Also of concern was the increasing costs involved with a voice network that makes lots of small processing transactions. With public cloud no longer a long-term viable option, Devoli looked to create its own.

Searching for a solution that would allow Devoli to continue to support its customers seamlessly, Nicod was in the market for a full five-year engagement and a partnership with end-to-end service.

Devoli had an existing relationship with Ingram Micro, who put forward HPE's Alletra dHCI product, which includes HPE Storage and HPE ProLiant servers and could provide streamlined infrastructure and optimised performance.

For Craig Murphy, general manager of channels at HPE, moving to an on-prem solution was an interesting scenario given the current wider trend of migrating to the cloud.

"With everyone saying that mass migration to the cloud is the solution to everything, we are saying, 'No, not quite. What's fit for purpose? Where does the data need to reside to get the best outcome?'" he said. "We have come to the realisation that mass migration to the cloud is not the answer; hybrid models are the way to go."

"We're in a hybrid world and HPE has positioned itself to be the perfect partner for that. It doesn't matter what cloud you're in; we can interact with it, and we can also have an on-prem or co-located solution at the same time. That's the way of the future for us."

With a partnership set in motion, this would be the first time such a solution had been employed in the Asia Pacific region.

"There's a lot of drive to go to someone else's cloud but not as much to create your own," Nicod said.

"Because of that, this engagement was exciting and different from most projects. We gained direct access to the HPE team, including their engineers."

The delivery of the project occurred at the height of the COVID-19 lockdown, presenting the additional challenges of masking and keeping engineers in bubbles. Following an eight-week deployment process, 68 out of 74 applications were migrated to the private cloud in one day.

"It was really tough times; we were in the thick of COVID. We had our deployment guys in full PPE with lots of strong protocols around them and they had to be in discrete bubbles. Managing that, as well as trying to get the supply chain to function, was a challenge," Murphy said.

"Ingram Micro facilitated a lot of that side of things for us, so it was a great success from a partnership perspective; with the three of us involved it all converged for us."

Devoli's network infrastructure and security team now runs entirely on the private cloud, with the exception of a few monitoring and alert systems in case the platform fails.

The software side of the business still operates through AWS due to flexibility for the development team, Nicod said, but over the next three to six months the company intends to migrate as many applications as possible to the private cloud.

While the number of platforms has been reduced from three to one to serve both Australia and New Zealand, Nicod says reliability has improved and costs have been cut by 50 per cent in migrating to the private cloud.

Other notable outcomes include a 50 to 75 per cent improvement in the time to deploy support applications, provision new workloads and deploy new compute and storage resources. Devoli has also seen a 25 per cent improvement in lifecycle management tasks, less time troubleshooting issues and automated capacity planning that makes the team's routines significantly faster.

"Devoli's differentiator is our ability to deliver any broadband voice service within a few hours, not weeks or months," Nicod said.

"We have some large customers who consume a lot of voice traffic. Now we have the enhanced reliability to move faster, which has garnered us even more business."


Read more:
Devoli bucks cloud migration trend with hybrid on-prem move to HPE - Reseller News

Living with data breaches in unregulated cyberspace – The Express Tribune

ISLAMABAD:

Data fusion, cloud computing and internet-enabled devices have brought us the greatest threat since the Cold War: the risk of cyber-attacks from proxy states.

With Pakistan's public sector institutions frequently being attacked by terrorist adversaries operating over anonymous TOR networks, it is becoming critical to train and organise a cyber force to assist the government in managing escalation in case of a cyber conflict.

Recently, LeakBase accessed the consumer data of Paysys Labs, an intermediary that integrates SBP's Raast services through its middleware, and published data of more than 50,000 users on the dark web.

The Philippine Cyber Alliance has attempted to attack over a dozen government websites this month, not to mention a cyber terror group that has published personal details of Punjab government employees.

Data of many private companies such as AM International and medIQ has reportedly been released on hacker forums.

It is crystal clear that individuals, businesses, and local governments can't bear this additional burden of ensuring cyber security; this domain must be dealt with by specialist organisations with niche technology to safeguard against these attacks. Not only are short-term defensive measures required urgently, but there is also a need to take a strategic approach to building resilience in IT systems.

What it means for policymakers is to isolate database systems from each other, wherever possible, and avoid funding programmes that lead to data fusion. For example, integrating NADRA, FBR and banking systems is too dangerous, though such an integrated system offers a dream dashboard for authorities.

Though their individual APIs are secure, the combined architecture inherently offers too much power to hackers.

Similarly, the fact that NTDC has an online dashboard available, which could be manipulated by any malicious user, leaves the entire electricity supply chain prone to attack. Russia has been attacking Ukrainian infrastructure, including power grids and banks, for a decade now.

Tracking such an attack or locating the cyber terrorists is tricky. A Russian hacker, over a VPN running in the US, may be using phishing emails to install malicious software on computers connected to our government's intranet, stealing data by uploading it to a Chinese cloud server.

Using the US as a proxy to launch the attack while collecting data on another server in China makes it difficult to geolocate such individuals. Tracking people in cyberspace becomes a jurisdictional nightmare, making cyber warfare a weapon of choice for ransom groups.

With multiple elections due to be conducted this year, adequate cyber security measures need to be taken in a timely manner, as many countries have cyber weapons to influence election results as well as public opinion.

By leveraging social media platforms run by Meta, it is very easy to use behavioural tools along with targeted ads to influence public sentiment, along the lines of Cambridge Analytica.

What we need to do is create awareness promoting data privacy and best practices for handling online public data at large. Resilience of our critical infrastructure and essential services must be the top priority, and strict SOPs need to be built into the system.

Cyber audits of critical assets, including the banking system, need to be conducted by the concerned regulatory authorities, and security auditors need to thoroughly review protocol stacks and software components every quarter, building a list of every component's licence, patch releases, and dependencies.

So, in case a particular software component gets compromised, all organisations whose IT systems were built using that component can be alerted in time.
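As a rough sketch of that idea (every name below is hypothetical), such an inventory reduces the alerting step to a simple reverse lookup from component to affected organisations:

```java
// Hypothetical component inventory: map each software component to the
// organisations whose IT systems were built with it, so a compromise can
// be traced to every affected party for timely alerts.
import java.util.List;
import java.util.Map;

public class ComponentInventory {
    public static void main(String[] args) {
        Map<String, List<String>> componentToOrgs = Map.of(
                "logging-lib-2.14", List.of("Bank A", "Power Utility B"),
                "web-framework-5.3", List.of("Power Utility B", "Ministry C"));

        String compromised = "logging-lib-2.14"; // e.g. flagged in a quarterly cyber audit
        componentToOrgs.getOrDefault(compromised, List.of())
                .forEach(org -> System.out.println("Alert " + org + ": patch " + compromised));
    }
}
```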

However, cyber security can also lead to less ease of doing business as a stringent SOP can slow down the clock speed of commercial operations.

For example, if NADRA stops issuing online ID cards and other certificates, fearing that fingerprints and signatures could be leaked and forged on illegally issued stamp papers to seal fake contracts, the inconvenience caused to an ordinary citizen will be enormous.

Similarly, the IoT devices that are quickly spreading among the masses to control household appliances remotely are a great convenience, but unfortunately all our data also gets dumped into servers located overseas.

In a worst-case scenario, our electrical appliances could also be controlled by any foreign cyber terrorist or a ransomware group.

Overall, the future of cyber security will require continued investment in latest technologies and approaches to keep pace with the evolving threats.

However, surveillance of citizens on the pretext of cyber security must be discouraged; a policy shift towards cyber security that ensures minimal infringement on citizens' right to privacy is needed while taking holistic security measures.

The writer is a Cambridge graduate and is working as a strategy consultant

Published in The Express Tribune, March 20th, 2023.

Like Business on Facebook, follow @TribuneBiz on Twitter to stay informed and join in the conversation.

See the original post:
Living with data breaches in unregulated cyberspace - The Express Tribune

Industry Insights: What will the newsroom of tomorrow be like? – NewscastStudio

Broadcast production vendors recently participated in an Industry Insights roundtable discussion on newsroom technology, looking at the current pain points and where the tech stack will help in the future.

The roundtable participants envisioned a newsroom of the future that is adaptable, diverse and more data-driven, with a focus on automation, collaboration, and integrated AI services. The workflow of the future will be streamlined, highly integrated, consolidated and agile, with greater emphasis on automation and collaboration.

Luis Fernandez, senior product marketing manager, Dalet: Newsrooms of all sizes will continue to face the challenge of catering to a wide and dispersed audience across various digital and broadcast platforms. Each platform, and its unique audience, has different expectations and ways to understand the story, which makes it difficult for newsrooms to plan, produce and distribute stories effectively and efficiently.

Miro Rusko, managing director APAC, Octopus Newsroom: I see two verticals. First is speed, where professional newsrooms aim to deliver speedy and verified information while competing with opportunistic social media accounts that seek exposure and thus publish information without verification. Second is IT cyber-security policies, which are in some cases obstructing the concept of working from anywhere.

Craig Wilson, product evangelist in broadcast and media enterprise, Avid: One of the biggest pain points for news organizations is ensuring effective collaboration between teams to efficiently produce and deliver compelling content across multiple platforms. Of course, another pain point is the usual resistance from editorial teams to adapting to new workflows and technologies.

Adam Leah, creative director, Nxtedition: One of the main challenges that newsrooms are currently facing is the use of outdated software, complicated workflows, and a shortage of skilled professionals. The tendency to rely on traditional methods and resist change can impede the newsgathering process and make it difficult to keep up with changing trends. However, by embracing new technologies and letting go of old habits, newsrooms can overcome these challenges and adapt to the current demands of the industry.

Ionut Johnny Pogacean, senior product manager, Vizrt: Some of the most significant pain points are the monetization of content, the ability to produce and deliver multi-platform stories with speed, and the balance of retaining brand identity while trying to match the language of the platform they are published to.

Jenn Jarvis, product manager, Ross Video: Visibility of information, whether that is information on what stories are in progress or information from a source. With newsrooms producing more content than ever before and from more locations, lack of visibility is what leads to redundant efforts, mistakes and general frustration.

Luis Fernandez: Cloud-based tools enable newsrooms to transcend physical boundaries and be accessible from any location. With cloud-native solutions like Dalet Pyramid, news professionals can access the same technology inside and outside the newsroom with ease and familiarity. This seamless handoff helps journalists break the news faster, work more collaboratively, and access all assets, communication and production tools from wherever the action occurs.

Gianluca Bertuzzi, sales manager for Africa and Latin America, Octopus Newsroom: Cloud-based tools are helping the newsroom by providing efficient and cost-effective ways to store, share, and access large amounts of data and multimedia content. They also enable remote collaboration and streamline the workflow.

Craig Wilson: Cloud-based tools are enabling collaboration, whether it's working from the field or from other remote offices. These tools also enable access to integrated AI services to supplement technical metadata and assist in the editorial process.

Adam Leah: There is some nuance to the definition of cloud technology, as it can refer to both public cloud and private cloud servers on-premise. I understand that as cloudflation affects the cost of cloud services, many in the industry are considering alternatives to mitigate these expenses. However, it's important to remember that the focus should be on the technology itself and how it can benefit newsrooms, rather than the physical location of the servers.

Johnny Pogacean: There is some degree of familiarity with working with web tools that makes things easier and more approachable. It is vital for today's journalists to go live from anywhere, and cloud tools allow the journalist to be as close to the story as possible without having to remote in to on-prem resources. In addition to that, another benefit is the quick updates that SaaS providers offer.

Jenn Jarvis: Centralizing information and collaboration in a single tool or set of tools is changing the way newsrooms work. And putting those tools in the cloud creates a consistent and cohesive workflow regardless of location. Journalists have always had to work on the go, but we've only recently gotten to a point where a remote workflow mirrors the same experience as working in the newsroom.

Luis Fernandez: Some newsrooms are more equipped than others, but the truth is, keeping up with the large variety of content will become more and more necessary with time. The role of AI in this matter is critical; how can newsrooms generate all the metadata required for discoverability, repurposing, distribution, and archive? Along with collaboration between different locations and broadcast and digital teams, this is the second most mentioned concern.

Bob Caniglia, director of sales operations in the Americas, Blackmagic Design: To keep up with the large demand for content across a variety of platforms, ranging from long-form video to social media snippets, newsrooms need to invest in flexible, all-in-one tools that support all types of content production needs, while also simplifying workflows.

Gianluca Bertuzzi: Newsrooms are equipped to handle a multitude of content, but it can still be a challenge to keep up with the demand for multimedia and interactive content, as well as ensuring that all content meets the high standards for accuracy and impartiality.

Craig Wilson: There is a big demand to quickly produce a lot of quality content while tailoring and rapidly delivering it to different platforms. Today most newsrooms can either produce good content or produce it quickly, and often need to find a compromise; a combination of skills training for staff and tools which can deliver content to any platform is needed.

Adam Leah: Not very many newsrooms are equipped to handle the plethora of content and platforms required in today's fast-paced news environment; they are too linear-led. The demands for content across different platforms and formats are constantly changing, and traditional newsroom installations are struggling to keep up with the pace. The lack of agile workflows and modern technologies, along with the industry-wide skills shortage, only exacerbates this issue.

Johnny Pogacean: While most cope well with gathering content, what happens after it varies significantly depending on the size of the newsroom and resources. It's not uncommon for broadcasters to simply clip their on-air content and publish that, but that means compromising on quality. Increasingly distributed newsrooms and audiences wanting news on-demand on their preferred platforms are prompting newsrooms to adopt a story-first approach. The term story-centric is used a lot in our industry for workflows that are organized around the story, but how that looks in practice varies greatly.

Jenn Jarvis: Some more than others. The larger organizations are investing the time and energy into analyzing and building multi-platform workflows while smaller newsrooms are often struggling to create the same content without the integrated tools. The biggest challenge for all is the rate at which the content strategies and publishing platforms are changing.

Luis Fernandez: Newsrooms will get more complex with time, as new social and digital media outlets emerge and methods for reaching audiences and telling stories evolve. Newsrooms are already evolving from the linear model, focused on broadcast, and developing a more story-centric approach powered by new tools, workflows, and resources with specialized skills.

Bob Caniglia: The newsroom of the future will be adaptable and supported by powerful, hybrid technology, but most importantly, it will be diverse, as professional technology is no longer reserved for the big broadcasters. With the continued adoption of accessible virtual technologies and cloud tools, creators will collaborate from anywhere in the world and will be unconstrained by one fixed studio or location.

Gianluca Bertuzzi: The newsroom of the future is likely to be more data-driven and technology-focused, with an emphasis on automation and collaboration. The use of artificial intelligence and machine learning is likely to increase, allowing journalists to focus on more in-depth reporting and storytelling.

Craig Wilson: The newsroom of the future provides a creative, story-centric approach to writing and content creation, while enabling access to material regardless of its location. This newsroom will require integrated collaboration tools for planning, creating, tracking and distributing content to multiple platforms, and integrated AI services to aid journalists and editorial teams with their work.

Adam Leah: The future requires us to be more pragmatic and forward-thinking. If we use the new technologies in the same way we used the old technology, we will never realise their full potential. Then there is all the AI stuff, which is great for transcription, translation, subtitling and indexing content, but there may be moral and political issues around facial recognition and synthetic media; it's technologically achievable, but is it moral? That's going to be an interesting future debate.

Johnny Pogacean: AI (for better or worse) will revolutionize how content is created, processed, distributed and consumed. As journalists become more multi-disciplined, they are expected to do a lot more. The tools journalists use have to evolve to match their needs, as complexity is just moved and managed differently; it doesn't disappear. Efficiency and accessibility will become even more critical in the future.

Jenn Jarvis: What's exciting to me about the next generation of journalists is their general comfort level with technology and the rate at which technology can change. They are well positioned to adapt as delivery platforms and audience consumption changes. I think we're going to see responsive newsrooms that are willing to experiment with new approaches and content formats.

Luis Fernandez: The newsroom workflow of the future will be hybrid, orchestrated, and intelligent, as the need and context in which newsrooms operate continue to evolve. Teams will need to be able to work collaboratively and effectively regardless of the consumption platform or the work location, and their workflows will need to be able to visualize, manage, assign, communicate, media edit, and distribute stories fast to different audiences on different platforms and be able to make a real impact.

Bob Caniglia: The newsroom workflow of the future will be streamlined as today's integrated and collaborative technologies empower creators to do more with less. Talent from all over the world will be able to create and share content in real time with their colleagues and newsrooms, contributing a diverse range of ideas and content.

Gianluca Bertuzzi:The newsroom workflow of the future is likely to be more streamlined, with a greater focus on collaboration and the use of technology to automate routine tasks. This will free up journalists to focus on more strategic and creative work.

Adam Leah: An exciting development we're working on is story versioning. With the need to cater to different age groups and various social media platforms, the ability to fork a story using ML into multiple versions is a crucial asset. Another key requirement will be speed; speed is of the essence in breaking the news to the audience. To accomplish this, a highly integrated, consolidated, and agile workflow will be a necessity, thereby ensuring a seamless journey for the story from ideation to the viewer. Both points will need a technological step change in the newsroom.

Johnny Pogacean: I expect that content will become more interactive and highly individualized. Imagine content being augmented and enhanced by AI: users will choose their preferred style and the amount of graphics they'll see in a story. I also suspect that services like we've seen in the last few months with ChatGPT will become ubiquitous, and they will be leveraged to deliver content in a highly individualized manner and provide the necessary context in a way that is more approachable and understandable to each individual, without the newsroom having to generate it all.

Jenn Jarvis: We're already seeing investment priorities shift to things like planning tools, asset management and analytics. The workflow of the future is going to be ecosystems where these tools are connected and cohesive. Many of the manual workflows we have today will be automated, but visibility of content and data will play important roles in how those automated workflows are built.

View original post here:
Industry Insights: What will the newsroom of tomorrow be like? - NewscastStudio

How memory management is key to scaling digital twins in the cloud – Diginomica

Scientists have been building supercomputers for simulating climate change for decades on special-purpose machines using specially crafted algorithms. Today powerful cloud computers are growing in compute and raw memory capacity for running industrial simulations. However, some consideration must be given to how this memory is managed to get all these different models to work together.

Memory issues may not be the first thing that comes to mind as enterprises and researchers build ever-larger digital twins. But they could become more significant as enterprises and researchers push the limits of larger models for adaptive planning scenarios like climate resiliency or building better products. The big challenge comes with predictive and prescriptive analytics designed to tease apart the knock-on effects of climate change on businesses and regions.

Building more accurate models means increasing the resolution and types of data. But this can also create hiccups that can stall models required to test various scenarios. This can be significant when running dozens or even hundreds of models to tease out the impact of multiple strategies or assumptions. In many of the largest models today, those built in programming languages like C require a lot of hand-tuning to free up memory. Meanwhile, programming languages like Java, with ambitious memory management capabilities, could be vital in building more extensive and flexible digital twins.
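To make the memory stakes concrete, here is a small hypothetical Java sketch (not Tygron's code, and the field choice is invented) contrasting the two usual ways of holding a simulation grid. A flat primitive array keeps per-cell overhead near zero, while an object per cell adds headers and references that multiply heap pressure and garbage-collection work:

```java
// Hypothetical grid-layout sketch for a digital-twin simulation.
public class GridMemory {
    static final int SIZE = 10_000; // 10,000 x 10,000 = 100 million cells

    public static void main(String[] args) {
        // Flat primitive array: ~800 MB for 100M double cells (8 bytes each),
        // allocated as one contiguous block. Run with e.g. -Xmx2g.
        double[] waterLevel = new double[SIZE * SIZE];
        waterLevel[index(42, 7)] = 1.5; // set one cell's water level

        // The alternative, one Cell object per grid point, would add an object
        // header and a reference per cell, several times this footprint.
        System.out.println("Cells allocated: " + waterLevel.length);
    }

    static int index(int row, int col) {
        return row * SIZE + col; // row-major addressing into the flat array
    }
}
```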

Maarten Kuiper, Director of Water International at Darieus, a civil engineering firm based in the Netherlands, has been developing ever larger digital twins for planners, farmers, citizens, and businesses planning for climate change. In some respects, the Netherlands has been on the front lines of climate change for decades, with ambitious efforts to protect low-lying lands from rising seas.

These days, Kuiper is helping plan against new combinations of floods and droughts. During a flood, it may be tempting to try and run all the water out to sea as quickly as possible, but then groundwater loses a valuable buffer against salt water running in. He was an early adopter of digital twin simulation tools from Tygron that allowed him to combine and overlay data sets about land elevation, hydrological conditions, land values, and demographic conditions.

The software also makes mixing and matching models from different sources easier. For example, he finds the latest tree models do a better job of modeling a tree's ability to suck up water, its impact on nearby structures, and how trees are affected by wind and elevation. Kuiper says:

Many people look at trees from different angles. You need to bring all those people together to make better decisions. With water security and climate change, we must bring citizens, governments, and businesses together.

Digital twin frameworks make it easier to bring in new data sets, models, and visualizations for different use cases. A business might want to see how flooding or, conversely, land subsiding might impact shipping routes, compromise the integrity of facilities, or affect supply chains. For example, the Port of Rotterdam used the same software to help plan a massive port expansion. This allowed them to align investment in the new expansion with returns, to guide profitable growth.

A big challenge is bringing more data to bear on better predictions and recommendations for planners. Kuiper explains:

We were early adopters. It started with a great visualization. But then we also need calculations for all kinds of simulations in their own domain. For example, we might need to calculate groundwater levels when the rain falls or what happens with a heat event. We needed software that could combine all those simulations in real time since the results are interconnected. This has helped us integrate analysis with all kinds of stakeholders who might be looking at something from different angles. It was also important to have information quickly in case of a disaster.

For example, in the wake of a flood, adding a relatively small earth bank in the right place can help adapt much better than a larger change elsewhere. A fast digital twin allows them to calculate all sorts of scenarios before acting in the real world. It also allows them to evaluate dynamic actions.

These larger digital twins would not have been possible without better memory management. Maxim Knepfle, CTO of Tygron, started working on the platform shortly out of high school. He adopted the Java programming language to strike the right balance between speed and performance. But he started running into long pauses as these digital worlds grew. Past a certain point, the simulations would pause for an extended period, which kept the simulations small or coarse. He had to keep the size of each grid cell at about twenty to thirty meters on a side, which also limited the accuracy and precision of the models. Knepfle says:

In those large data sets, the normal Java virtual machine would freeze for about two or three minutes, and your entire application would freeze.

While at the JavaOne conference, he stumbled across Azul, which was doing cutting-edge work in building more performant garbage collection into the Java runtime. He tried the new runtime, which cut the pauses to several milliseconds versus several minutes. This enabled his team to scale the latest models past twenty terabytes, supporting grids as small as twenty-five cm on a side with over ten billion cells.
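For scale, twenty terabytes across ten billion cells averages out to roughly 2 KB per cell. On a stock OpenJDK runtime you can observe collector behaviour with nothing more than the standard management API; the sketch below is illustrative only, and the ZGC flag shown is mainline Java's low-pause option rather than Azul's C4 collector:

```java
// Illustrative sketch: allocate heavily, then report how much time the JVM
// spent in garbage collection. Try e.g.:
//   java -Xmx8g GcPauseDemo                (default collector)
//   java -Xmx8g -XX:+UseZGC GcPauseDemo    (low-pause ZGC, JDK 15+)
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.util.ArrayList;
import java.util.List;

public class GcPauseDemo {
    public static void main(String[] args) {
        List<byte[]> retained = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            retained.add(new byte[1024 * 1024]);            // allocate 1 MB chunks
            if (retained.size() > 4_000)
                retained.subList(0, 2_000).clear();         // churn: release old chunks
        }
        long totalMs = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            totalMs += Math.max(0, gc.getCollectionTime()); // -1 if unsupported
            System.out.println(gc.getName() + ": " + gc.getCollectionCount() + " collections");
        }
        System.out.println("Approximate total GC time: " + totalMs + " ms");
    }
}
```

The absolute numbers are meaningless next to a twenty-terabyte model, but the shape of the result is the point: cumulative collection time grows with the size of the live data set, which is exactly the pause problem Knepfle describes.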

Even with the explosion in new languages, Knepfle is still a big fan of Java, since he finds it faster than Rust or Python and better at automating the underlying resources than languages like C++. This is important in building better digital twins, since his team wants to be able to bring in the latest algorithms and have them run quickly; that becomes a problem when the data sets get big.

Scott Sellers, CEO and co-founder of Azul, says that memory sizes available to work with have been growing thanks to cheaper memory and improvements in x86 architectures that give programmers access to more memory:

We would not have been able to do it without Moore's Law allowing more memory to be put into boxes and without help from Intel and AMD adding hooks in the microprocessor to tap into terabytes of memory. Five years from now, we will talk about maybe half a petabyte of memory in a physical box.

This is taking what used to be done on a supercomputer and enabling it in the cloud, which makes a lot of sense. Instead of building these $300 million data centers and populating them with expensive servers, we can replace them with lower-cost servers in the cloud.

The rapid advances in GPUs are paving the way for building ever-larger digital twins for industrial design, planning, predictive analytics and prescriptive analytics. Increasingly, these models will require running calculations across different types of data in parallel. For example, engineering and design teams are turning to multi-physics simulations that help identify the effects of design changes on mechanical, electrical, and thermal properties.

Other realms might similarly combine different kinds of economic, weather, demographic, and geologic models to adapt supply chains, plan expansions, or mitigate climate risks. Exploring multiple scenarios could require running lots of variations. Developers will need to consider the impact of memory allocation when creating these larger models at scale.

Read the original here:
How memory management is key to scaling digital twins in the cloud - Diginomica

Launch of Northrop Grumman-built SES-18 and SES-19 represents … – iTWire

Global content connectivity solutions provider SES has announced that its SES-18 and SES-19 satellites, designed and assembled by Northrop Grumman, were successfully launched by SpaceX's Falcon 9 rocket from Cape Canaveral Space Force Station in Florida, United States.

SES says the satellites were launched at 7:38 pm local time on Friday, March 17.

The two American-made satellites are the fourth and fifth, and final, satellites to be launched as part of SES's C-band transition plan, following the launch of SES-22 in June 2022 and the tandem launch of SES-20 and SES-21 in October 2022, notes SES.

These satellites are essential parts of SES's plan to fulfill the Federal Communications Commission's (FCC) program to clear C-band spectrum, enabling wireless operators to deploy 5G services across the contiguous U.S. (CONUS) while ensuring that SES's existing customers continue to enjoy uninterrupted TV, radio, and critical data transmission services to millions of Americans.

Since 2020, SES, along with other satellite operators, has been clearing 300 MHz of C-band spectrum and transitioning customer services to the remaining allocated 200 MHz of spectrum by launching new satellites, building new ground stations and sending hundreds of satellite earth station technicians across the country to install new filters on customers' antennas.

"By providing contractual service protections to customers who receive video services in the U.S., SES-18 and SES-19 will enable SES to safely clear C-band spectrum to help accomplish the FCC's ambitious goals for American 5G innovation. SES-18 is expected to begin operations in June 2023 at 103 degrees West, replacing the SES-3 C-band payload, and SES-19 will be co-located with SES-22 at 135 degrees West," SES said.

"This successful launch marks one of the last remaining milestones on our journey to clear a portion of the C-band, and we are incredibly grateful to Northrop Grumman, SpaceX, and all of our partners who helped make this plan a reality," said Steve Collar, CEO of SES. "We are now on the home stretch in protecting our customers' broadcasts while freeing crucial 5G spectrum, and we look forward to successfully concluding our work well before the FCC's December 2023 accelerated clearing deadline."

SES notes that more information on the SES-18 and SES-19 satellites can be found on the SES C-band in the U.S. newsroom.

Excerpt from:
Launch of Northrop Grumman-built SES-18 and SES-19 represents ... - iTWire