Category Archives: Cloud Servers

Dive into the history of server hardware – TechTarget

Millions of servers exist worldwide, many of them hidden away in server farms and data centers. But have you ever considered the technology's origins? The history of server hardware is fascinating because it reveals just how rapidly the technology evolves and how central a role it has played in data center development.

Before a review of the history of servers, it's important to understand what a server is and does. In its most basic form, a server is a computing program or device that provides a service to another computing program or device, also known as the client.

The service device performs a range of tasks from sharing hardware or software resources with a client to securely moving files between computers.
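The client-server relationship described above can be sketched in a few lines of Python. This is an illustrative toy, not any particular server product: the "service" here is simply an upper-case echo, and the port handling and names are made up for the example.

```python
import socket
import threading

ready = threading.Event()   # signals that the server is accepting connections
port_holder = []            # the OS-assigned port, shared with the client

def run_server(host="127.0.0.1"):
    """A toy server: provides one 'service' (an upper-case echo) to one client."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))                     # port 0: let the OS pick a free port
    srv.listen(1)
    port_holder.append(srv.getsockname()[1])
    ready.set()
    conn, _addr = srv.accept()              # wait for the client
    data = conn.recv(1024)                  # the client's request
    conn.sendall(data.upper())              # perform the service
    conn.close()
    srv.close()

def run_client(host="127.0.0.1"):
    """The client: sends a request and returns the server's response."""
    ready.wait()                            # don't connect before the server listens
    cli = socket.create_connection((host, port_holder[0]))
    cli.sendall(b"hello, server")
    reply = cli.recv(1024)
    cli.close()
    return reply

t = threading.Thread(target=run_server)
t.start()
reply = run_client()
t.join()
print(reply)    # b'HELLO, SERVER'
```

Whatever the service is, the shape is the same: the client asks, the server answers.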

Servers are built with powerful processing, memory and storage components, but it is the type of service a server provides, rather than the hardware that makes up the machine, that separates it from the average computer.

Today, there are many types of servers, such as application servers, proxy servers, file servers, policy servers and virtual servers. The most notable event in server hardware history, though, begins with the invention of the world's first web server in 1990.

In 1989, Tim Berners-Lee, a British engineer and computer scientist, invented the World Wide Web at CERN. He developed the World Wide Web to meet the need for automated information sharing between scientists across the world.

By Dec. 25, 1990, Berners-Lee had set up the world's first web server on a NeXT computer; the device had a 2 GB disk, a grayscale monitor and a 25 MHz CPU. The server is still at CERN, and has a big white label attached to the front of the machine that reads, "This machine is a server. Do not power it down!"

The first webpage ever created contained informational links about the World Wide Web project and the technical details of web server creation.

In December 1991, the first web server outside Europe was installed in California at the Stanford Linear Accelerator Center, and by late 1992, the World Wide Web project had expanded to include a list of other web servers available at the time.

In 1993, CERN put the World Wide Web into the public domain, which catapulted its growth and progress, and over 500 known web servers existed worldwide in December 1993. Server use continued to multiply. At the end of 1994, over 10,000 servers with more than 10 million users were established.

The proliferation of server technology led to the development of rack-mounted servers, the first of which was Compaq's ProLiant Series, released in 1993. A rack framework comprises multiple mounting slots, each designed to hold a server. Because a single rack can hold and stack multiple servers, less space is needed to store the machines.

This helped organizations fit even more servers into smaller spaces; however, keeping a rack in a confined area led to excessive heat buildup and required specialized cooling systems to maintain optimal temperatures.

Around this time, companies acquired more technology and moved all servers and equipment into singular rooms, colloquially referred to as server rooms. These rooms were unused or old areas within the company's walls. Eventually, organizations started to design rooms specifically for servers and address temperature monitoring and security issues. This change in infrastructure paved the way for the modern data center.

In 2000, Christopher Hipp and David Kirkeby applied for the blade server patent. One year later, RLX Technologies, the company where Hipp and Kirkeby worked, shipped the first commercially available blade server.

Blade servers were a step forward in the history of server hardware because they addressed several limitations of the rack-mounted server framework. Blade servers use fewer components than rack servers to minimize power consumption and save space.

Blade servers also fit within a blade enclosure, or a chassis, which can hold multiple blade servers at once. A blade enclosure can provide a variety of functions, such as cooling and networking hardware, and each enclosure can be rack mounted.

With blade servers, the technology became smaller but just as powerful, and companies could increase the density of dedicated servers within a data center. The benefits of blade servers resulted in a massive increase in efficiency and enabled organizations to use computing resources more effectively and strategically.

After the blade server's invention, focus turned from new hardware creation to management for better performance and efficiency. For example, server clusters provide users with higher uptime rates. A server cluster is a group of servers connected to a single system. If one server experiences an outage, the data center transfers the workload to another server and avoids any downtime on the front end.
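The failover behavior described above can be sketched as follows. This is a simplified illustration of cluster routing logic with hypothetical node names, not any specific cluster manager's implementation.

```python
def route_request(request, servers, is_healthy):
    """Send the request to the first healthy server in the cluster.

    `servers` is an ordered list of server names; `is_healthy` is the
    cluster's health-check callback. If the preferred server is down,
    the workload transparently moves to the next one, so the client
    never sees the outage.
    """
    for server in servers:
        if is_healthy(server):
            return f"{server} handled {request!r}"
    raise RuntimeError("all servers in the cluster are down")

cluster = ["node-a", "node-b", "node-c"]
down = {"node-a"}                              # simulate an outage on node-a

result = route_request("GET /index.html", cluster, lambda s: s not in down)
print(result)   # node-b handled 'GET /index.html'
```

Real cluster managers add heartbeat intervals, quorum, and state replication on top of this basic idea, but the core is the same: detect the failure, reroute the work.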

Out-of-band management, also known as remote management or lights-out management, also moved onto the scene. With lights-out management, an IT team could manage, manipulate and monitor servers without even physically stepping into a data center. This method of remote server management further improves efficiency and reduces the number of IT administrators required for server room management.

In 2013, HP Labs developed Moonshot, the world's very first software-defined server. Compared to traditional servers, Moonshot servers run on low-energy microprocessors and use less energy and space. These servers were designed to handle specific data center workloads, such as massive amounts of information and high-performance cloud computing.

Around this time, a new trend started to become popular: virtualization. A virtual server, or a cloud server, has all the capabilities of a hardware-based server but includes virtualization software to divide a physical server into multiple virtual servers. Virtual servers are good for highly variable workloads, so organizations that have fluctuating needs might prefer the flexible scaling provided by cloud servers; the technology takes away much of the physical server management requirements.

As data centers grow to address more diverse IT infrastructures, servers must evolve to meet increased demands in volume, performance and efficiency. The worldwide server market continues to grow, with a reported revenue of $25.4 billion in 2019, according to IDC.

Server hardware will likely become smaller, more compact and even more simplified, with a big focus on virtualization. It will be an interesting space to watch, and if the history of servers can tell us anything, the next big step forward is already in development.

Continue reading here:
Dive into the history of server hardware - TechTarget

NextCloud gets bigger and better with Nextcloud Hub 19 – ZDNet

I've used Nextcloud, a great open-source Infrastructure-as-a-Service (IaaS) cloud, for years on both my own on-premise and shared servers. It works well, it's simple to set up, and it does the main job of replacing public cloud storage services such as Google Drive, Microsoft OneDrive, and DropBox flawlessly. With this latest edition, Nextcloud Hub 19 is also adding improvements to its built-in, office Software-as-a-Service (SaaS) programs.

Frank Karlitschek, Nextcloud's CEO, explained this release is named "home office," because "COVID has accelerated a trend already visible in many businesses, forcing them to provide a secure remote collaboration solution to their employees. We named our release "home office" as it provides a large productivity boost for organizations employing many home office workers, like we do ourselves."

New Nextcloud Hub features add document collaboration to video chats, massively simplify authentication, and greatly improve performance.

How "greatly"? It's everywhere. Here are three examples. If you're using FTP, you can read files up to 500% faster. You can scan files up to 2.5x faster. And thumbnail generation is 25-50% faster for LibreOffice OpenDocument files and Krita paint files.

Nextcloud is also moving on from password security to WebAuthn, an emerging passwordless secure-login standard. With it, you can use Nitrokey's open-source, FIDO2-compliant hardware security keys, Windows Hello, and other hardware keys. It also offers a choice of other authentication options with stronger account security than the usual -- and so often busted -- username/password combination.

This release of Nextcloud Hub also includes several other new security features.

Nextcloud also has its own built-in video-conferencing and group meeting service: Nextcloud Talk. Here, the most interesting new feature is you can now edit office Collabora documents during video calls or from within a chat room. Collabora is a SaaS version of the popular open-source LibreOffice office suite.

While Nextcloud and Collabora have long worked hand in hand, Nextcloud Hub 19 is the first release to ship with Collabora Online ready to run. This is a special community server version, which dramatically eases its installation. Having installed the two together manually myself, I can tell you this is a real blessing; doing it by hand isn't trivial.

Unfortunately, as Nextcloud admits, "This version however sacrifices scalability for this ease of installation and is not suitable beyond private use. We recommend the existing Collabora Office solutions for that, with docker images for small offices and more scalable solutions available for enterprises through Nextcloud."

I found it worked well for personal use, but I wouldn't push it much further than that. While it's nice that it's there, it's a shame it doesn't go any further. Even a small-business, plug-and-play version, which could handle, say, five users, would be really useful for micro-businesses.

Another interesting Talk mini-feature is that it now automatically scales call quality depending on users' bandwidth. The program has also gotten its share of performance and scalability improvements. The net result is that people on Talk with limited internet connections can still participate successfully in a group conversation.

The Nextcloud Hub user interface has also been improved to make it much easier and faster to find, share, and collaborate on work files. Under the hood, this release also introduces several significant improvements.

Put it all together and you get the best all-in-one private cloud package available today. Sure, there are more complex open-source clouds that can do more in particular areas, such as OpenStack and Cloud Foundry. But if what you want is a straightforward business cloud that you, and not some vendor, control, then Nextcloud is what you want.

Go here to see the original:
NextCloud gets bigger and better with Nextcloud Hub 19 - ZDNet

Micron Has the Potential to Rise 50% From Here – TheStreet

Micron (MU) remains out of favor as investors shy away from its incredibly volatile stock.

However, its underlying performance continues to point in the correct direction, with its key markets -- data servers and smartphones -- showing strong tailwinds.

Once its operations move out of the slump they've been in and its free cash flow returns to the former high of $9 billion set in 2018, even a 10x multiple on free cash flow would see Micron valued at close to $90 billion. That would represent at least 50% upside potential.
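The arithmetic behind that claim is straightforward. Note that the current market cap used below is an assumed, illustrative figure chosen to match the stated 50% upside, not a number given in the article.

```python
# The article's back-of-envelope math: free cash flow returning to its
# 2018 high, capitalized at a 10x multiple. The current market cap below
# is an illustrative assumption, not a figure from the article.
fcf = 9e9                       # free cash flow back at $9 billion
multiple = 10                   # conservative 10x multiple on free cash flow
implied_value = fcf * multiple  # ~$90 billion implied valuation

assumed_market_cap = 60e9       # hypothetical current valuation for illustration
upside = implied_value / assumed_market_cap - 1

print(f"implied value: ${implied_value / 1e9:.0f}B, upside: {upside:.0%}")
# implied value: $90B, upside: 50%
```

Any market cap below $60 billion would make the implied upside greater than 50%, which is presumably why the article says "at least."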

Micron sells memory chips, which are commoditized products. It sells NAND and DRAM. NAND is the most commoditized memory type and has the poorest margin profile. DRAM has significantly better margins and makes up approximately 66% of total revenue, in a market with only three global players, Micron being one of them.

In essence, DRAM memory is a high performance, high-speed data retrieval memory, and is aimed mostly at cloud server, enterprise, smartphone and networking markets.

The other memory type Micron sells is NAND, which is a low-cost storage solution, such as home hard drives. This is a smaller part of Micron and less meaningful to the overall thesis here.

Micron sells into these key verticals: approximately 27% towards servers, 25% to smartphones, 20% into PCs, and 20% towards automotive uses (this includes industrials and other applications).

Thus, even though servers are performing strongly at present, until smartphone and industrial sales restart in earnest as the economy reopens, those weaker segments will continue to slightly offset its performance in servers and cloud solutions.

Indeed, a large portion of Micron's chips previously went into smartphones. Currently, however, with Apple (AAPL) possibly deferring its new iPhone on the back of supply disruptions, there is a knock-on effect on many of its customers, including Micron.

For its part, Micron has stated that it has been quick to pivot into server sales, but its mobile unit continues to drag down its performance.

Micron recently pre-announced its results and stated that its upcoming Q3 2020 results, to be reported at the end of this month, are going to come in strong.

In fact, compared with its guidance and its Q2 2020 results, even the bottom end of its pre-announced earnings is higher than the top end of its previous estimates.

Specifically, Micron now estimates that its non-GAAP EPS is going to be $0.75 to $0.80 compared with $0.70 of non-GAAP EPS (the top end of its previous guidance).

Even though some companies are thriving in our work-from-home economy, there are many that are struggling during this global contraction. Micron is not only meeting its guidance, but also raising its estimates and is still cheaply valued.

On the one hand, Micron is a cyclical stock. I argue, though, that its cycles are going to be substantially less severe going ahead, particularly over the very near term (12 months), than they were back in 2016 and throughout other memory slumps.

Nevertheless, to a huge extent, Micron sells a commoditized product, and if supply from competitors irrationally floods the market, there's little that can be done to overcome that environment.

Consequently, these points together somewhat depress Micron's multiple from getting too extended.

On the other hand, the demand for memory in servers, cloud, 5G, machine learning, and autonomous vehicles is only going to increase. Thus, there is a very strong secular tailwind to Micron's operations.

Also, during this downturn, Micron is still expected to be profitable, which is a vast improvement compared to the last downturn in 2016.

Taken together, given the right environment, Micron could well return to making $9 billion of free cash flow. Thus, without huge expectations, Micron's stock could trade at 10x free cash flow (not earnings, but clean free cash flow, which is valued significantly higher).

Micron reports its results in approximately one month's time. By that time, the company's visibility should be significantly better than it was when it last reported its results.

We know that as conditions become more favorable, Micron is able to rapidly ramp up sales, which given Micron's operating leverage, means that getting closer to historically high free cash flow will be easier in the new normal than it was before.

See the article here:
Micron Has the Potential to Rise 50% From Here - TheStreet

Kofax Partners with Microsoft to Enhance Cloud-Based Universal Print Solution with ControlSuite – Industry Analysts Inc

Combined Solution Digitally Transforms Content Workflow, Governance, Reporting, Security, Compliance and Productivity

London, UK, June 4, 2020: Kofax, a leading supplier of Intelligent Automation software for digital workflow transformation, today announces a partnership with Microsoft. The partnership delivers more value to Microsoft Universal Print users by letting them seamlessly leverage Kofax ControlSuite, the company's award-winning print management and cognitive capture software solution, across a wide range of printers and multi-function devices (MFDs). As more enterprises migrate to the cloud and highly distributed workforces become the norm, organisations need to provide employees with a secure and easy-to-use print and cognitive capture experience.

Universal Print, Microsoft's cloud-based print infrastructure, addresses this with a simple, rich and secure print experience for users. When combined with Kofax ControlSuite, IT departments don't need to dedicate time and resources to configuring print servers and local devices, alleviating much of the print administration effort and expense. Kofax ControlSuite works across the most comprehensive range of printer and multi-function device (MFD) brands and models, as well as mobile devices, and adds to this seamless experience.

"Kofax and Microsoft share a common goal of helping customers digitally transform business operations and workflow to drive enhanced efficiency, productivity and experience," says Chris Huff, Chief Strategy Officer at Kofax. "ControlSuite and Universal Print represent the future of work, providing customers a modern print infrastructure delivered through cloud services. Customers benefit from Microsoft's cloud services while Kofax ControlSuite provides a single print management, cognitive capture and output management platform across the enterprise, resulting in reduced cost of ownership and improved, secure experiences for employees."

Kofax ControlSuite's unique ability to provide print management, cognitive capture and output management means printing, scanning, content extraction and document workflows can be automated across any combination of hybrid systems and technologies. These include printers, MFDs, mobile and desktop devices, email and print streams.

"Universal Print was designed to move key Windows Server print functionality to the Microsoft cloud, so organisations no longer need on-premises print servers, and to remove the need to install printer drivers on end-user devices," says Issa Khoury, Principal Program Manager Lead at Microsoft. "Partnering with Kofax, we're able to offer Kofax and Microsoft customers the joint value of Universal Print and ControlSuite and elevate user experience to the next level."

Microsoft's Universal Print is currently in private preview and will be available soon to Microsoft 365 users. Microsoft and Kofax hosted a joint webinar discussing the functionality of Universal Print and Kofax ControlSuite.

About Kofax

Kofax enables organisations to Work Like Tomorrow, today. Kofax's Intelligent Automation software platform helps organisations transform information-intensive business processes, reduce manual work and errors, minimise costs, and improve customer engagement. We combine RPA, cognitive capture, process orchestration, mobility and engagement, and analytics to ease implementations and deliver dramatic results that mitigate compliance risk and increase competitiveness, growth and profitability. For more information, visit kofax.com.

# # #

© 2020 Kofax, Inc. Kofax is a registered trademark of Kofax Limited.

For further information, please contact: Nicola Males / Vidushi Patel, 07976 652491, prkofax@vanillapr.co.uk

SOURCE Kofax

Kofax Customer Rabobank Innovates RPA Robot to Automate Government Relief on COVID-19 Related Loan Repayments

Originally posted here:
Kofax Partners with Microsoft to Enhance Cloud-Based Universal Print Solution with ControlSuite - Industry Analysts Inc

Dell and Intel answer the call for AI by building specific solutions for real problems – SiliconANGLE

With its position as a key provider of server and processor solutions to the world's largest companies, Intel Corp. can see trends coming from miles away.

That's the kind of perspective that over half a century in the technology business can provide. The company saw the coming of cloud computing, and now it is witnessing the latest wave of artificial intelligence; at least one executive believes that growing use cases are moving into a phase where the tangible benefits of AI in the enterprise are real.

"AI went through the same thing that cloud did, where you have every business leader or chief information officer saying: 'Hey, get me a cloud, and I'll figure out what for later,'" said Lisa Spelman (pictured, left), corporate vice president and general manager of the Xeon and Memory Group at Intel. "It was 'get me some AI, and we'll figure out if we can make it work.' We're through those initial use cases, and we're starting to see business value derived from some of those deployments."

Spelman spoke with Jeff Frick, host of theCUBE, SiliconANGLE Media's livestreaming studio. She was joined by Ravi Pendekanti (pictured, right), senior vice president of server product management at Dell Technologies Inc. Frick also spoke with Jeremy Rader, general manager of digital transformation and scale solutions at Intel, in a separate interview, and heard from Thierry Pellegrino, vice president of business strategy and HPC solutions, server and infrastructure systems, at Dell. They discussed how AI is helping users manage data at significant scale, the impact of partnerships with independent software vendors, a joint project to map the human brain, and tailoring technology for the right solution. (* Disclosure below.)

As the use cases for AI continue to expand, Intel and Dell Technologies Inc. have partnered to provide solutions to businesses across a wide spectrum. One such customer is Epsilon Data Interactive Inc., a provider of permission-based email marketing services to major companies, including Dell.

Epsilon uses AI and machine learning to analyze customer activity in significant volume.

"What really blew my mind is they service or send out close to 100 billion messages a year, so you can imagine the amount of data they are analyzing," Pendekanti said. "It's all possible because of the kind of analytics we have driven into PowerEdge servers using the latest Intel Xeon processor, coupled with some of the technology from the field-programmable gate array side."

What is different from the "get me some AI" days is that both customers and providers such as Intel and Dell are building specific solutions into their respective technologies to best address business needs while partnering with independent software vendors. This dynamic has created an ecosystem that can help enterprises get desired results faster and with more impact.

"It starts first with delivering the best hardware for AI, and Xeon is the foundation for that," Spelman explained. "On top of that, there's the optimized software which is going into each of those frameworks and doing the work so that the framework recognizes the specific acceleration we've built into the CPU. Once we've done that software layer, this is where we have the opportunity for a lot of partnership."

Intel works with a number of ISVs in partnership with Dell to support customer needs. This forms a three-legged stool of value for the delivery of AI-based solutions.

"What we've done with Dell is bring that portfolio together with Dell's capabilities and then bring in that ISV partner, that software vendor, where we can really bring the most value out of that broad portfolio," Rader said. "If you bring in the software vendor, hardware vendor, and Dell into the mix, you get a really strong outcome."

What are some of those outcomes? Researchers from McGill University and the University of Montreal are working on an artificial neural network that functions like a human brain. This specialized work requires the use of high-performance computing to process and analyze large-memory MRI images.

The project leverages Intel's Zenith cluster combined with Dell PowerEdge servers and Xeon processors to make breakthroughs in cognitive science.

"We collaborated with Intel on a tuning of algorithms for them in code in order to accelerate the mapping of the human brain," Pellegrino explained. "Think about what you can get with that kind of information in order to cure Alzheimer's or dementia down the road. It is using technology to help all of us and those who are suffering from really tough diseases."

Based on the experience of Dell and Intel, through projects such as the one in Canada, AI is beginning to move from an object of desire with no clear purpose to solving complex enterprise and human problems. It is being demystified and, for practitioners such as Dell's Pendekanti, that's a good thing.

"Most of us probably use an ATM to withdraw money, but we really don't know what sits behind the ATM," Pendekanti said. "Our mantra for this is very simple. We want to make sure we use the right basic building blocks, ensuring that we bring the right solutions."

Here's the complete video interview, one of many CUBE Conversations from SiliconANGLE and theCUBE. (* Disclosure: Dell Technologies Inc. sponsored this segment of theCUBE. Neither Dell Technologies nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Go here to see the original:
Dell and Intel answer the call for AI by building specific solutions for real problems - SiliconANGLE

Data Protection As A Service Market Projection of Each Major Segment over the Forecast Period – Cole of Duty

The report "Data Protection as a Service Market: by deployment model (public cloud, private cloud, hybrid cloud), by service type (DRaaS, BaaS, STaaS), by end user (large, medium and small enterprises) and by region (North America, Europe, Asia Pacific, Middle East & Africa and Latin America): Global Industry Analysis and Forecast 2018-2026" finds that the global data protection as a service market was valued at US$3 billion in 2017 and is expected to reach US$32.8 billion by 2026, at a CAGR of 35.5% during the forecast period.
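Those endpoint figures can be used to sanity-check the reported growth rate. The compounding window used below is an assumption, since the report does not state its base-year convention; it lands close to, but not exactly at, the quoted 35.5%.

```python
# Sanity-check the reported CAGR from the endpoint values.
start_value = 3.0    # US$3 billion (2017 base)
end_value = 32.8     # US$32.8 billion (2026 forecast)
years = 8            # assumed: the 2018-2026 forecast window gives 8
                     # compounding steps; this is an interpretation

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")   # implied CAGR: 34.8%
```

Using nine compounding years (2017 to 2026) instead would imply roughly 30%, so the report's 35.5% is most consistent with an eight-year window.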

Request for Report Sample: https://www.trendsmarketresearch.com/report/sample/10288

Data protection as a service (DPaaS) is a cloud-based or web-delivered service for protecting data assets. Data protection is a vital part of IT infrastructure. Adding security to typical application is a way to expand the business into new markets. DPaaS tools support the use of technologies like VPN to aid in remote work security. Virtualization of storage devices and servers is one of the major trends gaining traction in the overall DPaaS market. DPaaS restores data more quickly than tapes or offsite backup.

Private Cloud is expected to dominate the market throughout the forecast.

The private cloud's greater data security, along with other advancements, has revolutionized the deployment of DPaaS in these enterprises. A private cloud implies that you manage the entire virtualization infrastructure, from the componentry to the applications.

Backup as a Service (BaaS) is the leading segment of data protection as a service.

Backup as a service (BaaS) is an approach to backing up data that involves purchasing backup and recovery services from an online data backup provider. Instead of performing backup with a centralized, on-premises IT department, BaaS connects systems to a private, public or hybrid cloud managed by the outside provider. Backup as a service is easier to manage than other offsite services.

Request for Report Discount: https://www.trendsmarketresearch.com/report/discount/10288

North America was the leading regional market for data protection as a service.

North America is the most profitable market compared with others, with diverse industry verticals implementing DPaaS to a greater extent; it is expected to generate the highest market revenue over the forecast period, with deployments predominantly in large enterprises. Asia-Pacific is estimated to grow at the fastest pace, owing to heavy investment by governments in infrastructure for effective data protection.

This report includes a study of marketing and development strategies, along with the product portfolios of leading companies. It includes the profiles of leading vendors such as IBM Corporation, Amazon Web Services, Inc., HP Development Company L.P., Commvault Systems, Inc., EMC Corporation, VMware, Inc., Quantum Corporation, Asigra, Inc., Veritas Technologies, Cisco Systems, HTC Corporation, Red Hat, Microsoft, Citrix, OneLogin and Infocom Corporation.

Maximize Market Research, a global market research firm with a dedicated team of specialists and data, has carried out extensive research on the global data protection as a service market. The report covers the market by segment and region, providing an in-depth analysis of the overall industry ecosystem that is useful for key stakeholders making informed strategic decisions. Importantly, the report delivers forecasts and market shares, giving insight into market dynamics and the future opportunities that might exist in the global data protection as a service market. The driving forces, as well as considerable restraints, are explained in depth. In addition, the competitive landscape describes the strategic growth of competitors, enhancing clients' market know-how and explaining competitors' positioning in the global data protection as a service market.

Browse the market data tables and figures spread through the comprehensive research report and in-depth TOC on the global data protection as a service market.

Make an Inquiry before Buying: https://www.trendsmarketresearch.com/checkout/10288/Single

Read more here:
Data Protection As A Service Market Projection of Each Major Segment over the Forecast Period - Cole of Duty

PAM as a Service: It's All a Matter of Trust – Security Boulevard

With shifting priorities and dynamic technology environments, IT security teams are looking for ways to cover the most ground while draining as few resources as possible. Privileged access management (PAM) continues to be a priority for many organizations as compromised privileged credentials are linked to nearly all attacks. Today, with cyber attackers targeting organizations as they invest in new tools and technologies to support remote work, many security leaders are struggling with how to prioritize new investments and how to get the most out of their existing budgets.

This is where PAM as a Service can help.

Deploying PAM as a Service can help reduce risk by locking down access to a company's sensitive data, systems and applications while optimizing resources. It also doesn't require additional IT resources to manage on-premises infrastructure or perform upgrades, patches and more. Sounds great, right?

Before going down that path, though, it's important to know what qualities are essential for a PAM as a Service solution you can trust.

SaaS solutions offer opportunities for companies to gain more control over their data, helping them understand how much data they have and where exactly it resides. While it is up to the organization to manage its own policies and users on the ground floor, any company looking to secure sensitive data, systems and applications in the cloud must trust the SaaS vendors it is handing the keys to.

Understand the security of the service. Frequently, businesses don't investigate how exactly security vendors manage and safeguard customer data. They simply assume that everything is completely secure. The American Institute of Certified Public Accountants (AICPA) provides independent assessments known as SOC 2 to help organizations understand exactly how companies safeguard customer data and how well those controls are operating. These reports cover the principles of security, availability, confidentiality and privacy.

Most vendors with cloud offerings hosted on major cloud providers like AWS, Azure or GCP will tout SOC 2 Type 2 compliance, but verifying that the service in question actually holds this attestation further demonstrates a commitment to security for customers. This is an important check to make before trusting a SaaS provider to keep your data secure and private and to help ensure that the service will work how and when you need it to.

Know how data is stored and secured. This is a key component of SaaS itself and should be a major consideration. Data moving between the cloud and corporate assets needs to be encrypted both at rest and in transit. Secure SSH tunnels from the cloud provider to customer-operated systems, such as Active Directory servers and SIEM servers, ensure that traffic cannot be intercepted by malicious attackers.
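As one illustration of the in-transit requirement above, a client can refuse any connection that is not both encrypted and verified. Below is a minimal sketch using Python's standard-library ssl module; the strictness settings shown are illustrative choices, not any particular vendor's configuration:

```python
import ssl

def make_strict_client_context() -> ssl.SSLContext:
    """Build a TLS context that refuses unencrypted or unverified connections."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocol versions
    ctx.check_hostname = True                     # the server must prove its identity
    ctx.verify_mode = ssl.CERT_REQUIRED           # no certificate, no connection
    return ctx
```

Any socket wrapped with this context will fail fast against a server that cannot present a valid certificate, which is the behavior you want between cloud and on-premises assets.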

Keep privileged account information safe. If privileged account information is going to be transmitted between the cloud and on-premises assets, investigate whether the cloud provider encrypts that network traffic so it is undecipherable to attackers, preventing illicit data exfiltration. The principle of least privilege should be applied when access is needed to upgrade backend systems and integrate new features: access is denied by default and permitted only when essential.
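The deny-by-default rule described above can be sketched in a few lines: access is permitted only if an explicit grant exists, and everything else falls through to a denial. A minimal Python illustration (the principal and action names are hypothetical):

```python
def is_access_allowed(principal: str, action: str,
                      grants: dict[str, set[str]]) -> bool:
    """Deny-by-default check: permit only when an explicit grant exists."""
    return action in grants.get(principal, set())

# Hypothetical grant table: only the upgrade service may patch backend systems.
grants = {"upgrade-service": {"patch:backend"}}

assert is_access_allowed("upgrade-service", "patch:backend", grants)  # explicit grant
assert not is_access_allowed("analyst", "patch:backend", grants)      # denied by default
```

The key design choice is that an unknown principal or action never raises an error and never slips through; the absence of a grant is itself the denial.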

Choose a cloud partner you can count on. Finally, the business stability of the vendor itself will show whether you have a partner in security that will be around for the long haul and able to keep up with the rapidly changing demands of today's IT world. For SaaS, this is particularly relevant, as cloud-first organizations change on the fly and need solutions that are secure and as nimble as they are.

To learn more about finding a privileged access management solution you can trust, join us June 19 at 12:00 PM ET for "A Service You Can Trust," the final installment in the Friday 15 webinar series.

Recent Articles By Author

*** This is a Security Bloggers Network syndicated blog from CyberArk authored by Andrew Silberman. Read the original post at: https://www.cyberark.com/blog/pam-as-a-service-its-all-a-matter-of-trust/

The rest is here:
PAM as a Service: It's All a Matter of Trust - Security Boulevard

How To Best Adapt Your Business When The World Is Moving Online – Forbes

With the world on its way to moving online, social and work habits have seen a significant shift, forcing businesses in different industries to adapt to a rapidly changing environment. Companies of all sizes need to adjust in order to remain relevant. But how?

Moving Online

If your company operates mainly offline, you'll need to find ways to move into the digital world. For that, you need a server. If you have no experience, it can be challenging to decide between a virtualized server and a physical one, so spend some time researching which type best suits your workload and business needs. But merely moving online will not be enough. You'll need to make sure that your infrastructure is reliable, scalable and cost-effective.

Upgrading Your Infrastructure

Regardless of whether you already have an online presence, you should be aware that high availability is critical: your website always needs to be up and running. Broken infrastructure can result in downtime and error pages, and even short periods of downtime can damage your revenue streams. Having a scalable infrastructure allows you to adjust resources as necessary with minimal downtime -- for example, to dynamically increase CPU, RAM or storage size, according to your business needs.

Optimizing Your Website

Businesses used to optimize their websites to handle Black Friday traffic surges. Given the new shift to remote work, they are faced with a permanent Black Friday-like situation: increased orders, high traffic and other challenges. There are many ways to optimize your website and increase its resilience.

Bare Metal Versus Public Cloud

Public cloud servers use a resource pool from numerous dedicated servers; this allows resources to be allocated to virtual machines tailored for the needs of every client. The main reason many opt for cloud servers is that one can rapidly provision resources on demand, which makes them highly flexible, scalable and easy to use.
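The resource-pool model described above can be sketched as a small allocator: virtual machines are carved out of shared capacity on demand and their resources return to the pool when released. A minimal Python illustration (the capacities and VM sizes are made up for the example):

```python
class ResourcePool:
    """Minimal sketch of a shared pool backing multiple tenants' virtual machines."""

    def __init__(self, total_cpus: int, total_ram_gb: int):
        self.free_cpus = total_cpus
        self.free_ram_gb = total_ram_gb
        self.vms: dict[str, tuple[int, int]] = {}  # vm_id -> (cpus, ram_gb)

    def provision(self, vm_id: str, cpus: int, ram_gb: int) -> bool:
        """Carve a VM out of the pool on demand; fail cleanly if capacity is exhausted."""
        if cpus > self.free_cpus or ram_gb > self.free_ram_gb:
            return False
        self.free_cpus -= cpus
        self.free_ram_gb -= ram_gb
        self.vms[vm_id] = (cpus, ram_gb)
        return True

    def release(self, vm_id: str) -> None:
        """Return a VM's resources to the pool."""
        cpus, ram_gb = self.vms.pop(vm_id)
        self.free_cpus += cpus
        self.free_ram_gb += ram_gb
```

This is why cloud servers feel flexible: provisioning is a bookkeeping operation against already-running hardware, not a hardware purchase.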

By contrast, on a dedicated server, also known as bare metal, you do not share resources with anyone. This type of server is a single-tenant machine, private to you, without any interference from other users. Superior performance, no resource restrictions and greater security are some of its benefits. Bare metal servers not only offer better performance, but also allow you to customize your settings to improve load times and facilitate optimal handling of traffic. Security is enhanced from the start, because bare metal servers isolate your data.

Cost-Effectiveness

Aside from trying to increase revenue during this period, most businesses will also try to lower their costs behind the scenes. Although the public cloud seems like a cost-effective option for your small company, it's not as cheap as it appears. Most cloud providers charge customers for each gigabyte of data sent between the environments currently in use, which adds up fast. Moreover, the more cloud environments a company operates, the more it ends up costing per month.
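How quickly per-gigabyte transfer charges add up can be shown with simple arithmetic. A sketch in Python, where the $0.09/GB rate and the traffic volume are illustrative assumptions only; real prices vary by provider, region and tier:

```python
def monthly_egress_cost(gb_transferred: float, price_per_gb: float) -> float:
    """Per-gigabyte transfer charges -- a cost many teams overlook when budgeting."""
    return gb_transferred * price_per_gb

# Illustrative: 5 TB of inter-environment traffic per month at an assumed $0.09/GB.
cost = monthly_egress_cost(5 * 1024, 0.09)  # roughly $460/month before any compute costs
```

Against numbers like these, a flat monthly fee for a dedicated server becomes easy to compare, which is exactly the budgeting predictability the next paragraph describes.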

To avoid the extra public cloud costs, you could turn to dedicated servers, which offer customization capabilities for hardware and apps. The monthly fee is predictable because there are no additional charges, which helps a lot with budget planning. Compared to cloud environments, dedicated servers offer better performance, enhanced security and customizable settings that can meet your unique business needs. But are they as scalable and flexible as public clouds? Some bare metal providers have moved to a hybrid model that allows almost the same flexibility as a cloud.

The Future Is Online

During these uncertain and turbulent times, moving online is almost a must. However, you should do it right, or you might end up investing more than you gain from it. Choosing the right server type and optimizing it for your needs will be an excellent start for your digital journey.

The rest is here:
How To Best Adapt Your Business When The World Is Moving Online - Forbes

Cloud computing via satellite to drive 52 Exabytes of traffic by 2029: NSR – SatelliteProME.com

NSR predicts $11bn in cloud-based revenue over the next decade, with Satcom markets leading the way.

Cloud computing via satellite is projected to drive 52 Exabytes of traffic by 2029, with players in the satellite and space industry contributing cumulative revenue of $16bn from 2019-2029, according to a new report by NSR. The market is largely Satcom-centric, with a significant portion of the revenue flow going to service providers and satellite operators.

"The transformation brought about by the adoption of cloud computing and big data analytics is only beginning to impact the satellite sector," states Shivaprakash Muruganandham, NSR analyst and report author. "It ranges from cloud-hosted applications by end customers to cloud storage and processing by geospatial analytics providers. For a growing number of satellite operators and service providers, partnering with big IT and cloud players helps them drive increased bandwidth usage with existing customers."

The primary cloud verticals, maritime offshore and passenger cruise, along with aeronautical Satcom, will generate more than $7bn cumulatively over the 10-year forecast period. Additionally, onshore energy, gov/mil, and retail and banking Satcom markets are set to expand their cloud-first digitisation strategies.

Muruganandham adds: "Surprisingly, the Earth observation industry was found to trail behind in terms of raw data traffic downlinked to cloud servers, while the downstream geospatial analytics segment shows strong signs of growth."

NSR recognises that the cloud may not be cost-effective at scale for everyone, despite its tremendous value-add. Still, newer applications, such as using satellites for cloud storage and compute capabilities, are nascent markets set to grow strongly over the coming decade.

Read the original:
Cloud computing via satellite to drive 52 Exabytes of traffic by 2029: NSR - SatelliteProME.com

Multinational Insurance Company Completes Upgrade of Majesco Policy for P&C from On-Premise to Majesco CloudInsurer to Bolster Growth Strategy -…

MORRISTOWN, N.J.--(BUSINESS WIRE)--Majesco (NASDAQ: MJCO), a global leader in cloud insurance software platforms, today announced that long-time customer Multinational Insurance Company has completed its upgrade of Majesco Policy for P&C from an older on-premise version to the newest version on Majesco CloudInsurer for its commercial lines operation. Multinational Insurance has been a Majesco customer since 2012 and is one of twenty Majesco customers in Puerto Rico.

Established in 1983, the Multinational Group has spent over thirty years expanding its operations and acquiring prestigious insurance companies throughout Latin America and the Caribbean, including Puerto Rico. In 2017, the organization began its journey to move its core applications to the cloud to ensure business continuity and guarantee the availability of its systems and operations. By working with a partner like Majesco, which has a well-defined disaster recovery process, it can now run business-critical applications in secondary sites in the event the primary ones go down.

"Majesco's CloudInsurer has improved our overall performance and given us the ability to scale on demand, as we no longer need to depend on servers with limited capacity," says Mary Vargas, Vice President of Information Technology & Systems at Multinational. "We've also been able to limit the amount of manual effort needed to manage the administration of our infrastructure, servers and networking, which has allowed us to focus more on the critical day-to-day business functions that require our attention. This was a cross-collaborative effort, combining adept cloud skills from Majesco and exceptional support from all Multinational team members to achieve this upgrade."

By moving to Majesco's CloudInsurer, Multinational has benefited from significant cost savings on software licenses typically used for web, application and database services. In addition, customers can now rely on a single source for managing service levels, including application availability, platform updates, release management and incident response, rather than coordinating with various parties.

"We're thrilled to have helped Multinational upgrade Majesco Policy for P&C and, at the same time, move to CloudInsurer to strengthen operations that can support its growth strategy in the market," says Prateek Kumar, EVP at Majesco. "Our CloudInsurer platform with Majesco Policy for P&C is a competitive differentiator for customers because of the ready-to-use content and capabilities that accelerate speed to market, providing the foundation for on-demand insurance in the cloud."

About Majesco

Majesco (NASDAQ: MJCO) provides technology, expertise, and leadership that helps insurers modernize, innovate and connect to build the future of their business and the future of insurance at speed and scale. Our platforms connect people and businesses to insurance in ways that are innovative, hyper-relevant, compelling and personal. Over 200 insurance companies worldwide in P&C, L&A and Group Benefits are transforming their businesses by modernizing, optimizing or creating new business models with Majesco. Our market-leading solutions include CloudInsurer P&C Core Suite (Policy, Billing, Claims); CloudInsurer LifePlus Solutions (AdminPlus, AdvicePlus, IllustratePlus, DistributionPlus); CloudInsurer L&A and Group Core Suite (Policy, Billing, Claims); Digital1st Insurance with Digital1st eConnect, Digital1st EcoExchange and Digital1st Platform, a cloud-native, microservices and open-API platform; Distribution Management; Data and Analytics; and an Enterprise Data Warehouse. For more details on Majesco, please visit http://www.majesco.com.

Cautionary Language Concerning Forward-Looking Statements

This press release contains forward-looking statements within the meaning of the "safe harbor" provisions of the Private Securities Litigation Reform Act. These forward-looking statements are made on the basis of the current beliefs, expectations and assumptions of management, are not guarantees of performance and are subject to significant risks and uncertainty. These forward-looking statements should, therefore, be considered in light of various important factors, including those set forth in Majesco's reports that it files from time to time with the Securities and Exchange Commission and which you should review, including the statements under "Item 1A Risk Factors" in Majesco's Annual Report on Form 10-K, as amended by its Quarterly Reports on Form 10-Q.

Important factors that could cause actual results to differ materially from those described in forward-looking statements contained in this press release include, but are not limited to: the adverse impact on economies around the world and our customers of the current COVID-19 pandemic; our ability to achieve increased market penetration for our product and service offerings and obtain new customers; our ability to raise future capital as needed; the growth prospects of the property & casualty and life & annuity insurance industry; the strength and potential of our technology platform and our ability to innovate and anticipate future customer needs; our ability to compete successfully against other providers and products; data privacy and cyber security risks; technological disruptions; our ability to successfully integrate our acquisitions and identify new acquisitions; the risk of loss of customers or strategic relationships; the success of our research and development investments; changes in economic conditions, political conditions and trade protection measures; regulatory and tax law changes; immigration risks; our ability to obtain, use or successfully integrate third-party licensed technology; key personnel risks; and litigation risks.

These forward-looking statements should not be relied upon as predictions of future events and Majesco cannot assure you that the events or circumstances discussed or reflected in these statements will be achieved or will occur. If such forward-looking statements prove to be inaccurate, the inaccuracy may be material. You should not regard these statements as a representation or warranty by Majesco or any other person that we will achieve our objectives and plans in any specified timeframe, or at all. You are cautioned not to place undue reliance on these forward-looking statements, which speak only as of the date of this presentation. Majesco disclaims any obligation to publicly update or release any revisions to these forward-looking statements, whether as a result of new information, future events or otherwise, after the date of this press release or to reflect the occurrence of unanticipated events, except as required by law.

More here:
Multinational Insurance Company Completes Upgrade of Majesco Policy for P&C from On-Premise to Majesco CloudInsurer to Bolster Growth Strategy -...