Category Archives: Cloud Servers
Keeping Control Over Cloud With IPAM Sync – The Fast Mode
Hybrid clouds are becoming increasingly common. Over 90 percent of companies in Asia Pacific (excluding Japan) will rely on a mix of on-premises infrastructure, public clouds and private clouds by 2021 to meet their infrastructure needs, according to a recent study by IDC.
At the same time, the opportunity for cybercrime has grown in tandem with the increasing volume of data flowing through this infrastructure; it has also been compounded by disruptions to the technology industry - the COVID-19 pandemic being a case in point. The 2019 Singapore Cyber Landscape report emphasised the importance of a resilient critical information infrastructure as one of the objectives for preventing significant disruption to the country's economy and society.
For managing cloud infrastructures, the options are to use tools from the cloud provider, to develop home-grown solutions, or to take advantage of a cloud management platform. Whichever option they choose, IT teams are likely to face challenges such as siloed management, limited visibility of resources, or having to maintain multiple separate repositories.
Visibility is key for management
The fact remains that using multiple hosting providers for application workloads does not ease administration, operations or troubleshooting. It is a real challenge to maintain coherence between infrastructure components spread across multiple datacenters, cloud providers and IaaS solutions. When everything was hosted in a single big datacenter on a single VMware cluster, management was simpler, despite being distant from most infrastructure and operations teams.

Visibility from a single viewpoint is therefore key for infrastructure management, so these teams require a central trusted repository that is accurate and up-to-date, wherever the workloads are running. This is best provided by an IPAM (IP Address Management) solution. The repository supports not only simple management activities, but also automated network tasks and more advanced requirements from business teams, such as auditing or security orchestration.
Why ongoing synchronization matters
Ensuring the central repository remains accurate requires Cloud IPAM Sync, i.e. information synchronized in near real time between the clouds and the IPAM. This keeps the repository of information up-to-date - a feature of growing importance in our fast-developing world of technology. Synchronization is an ongoing process that browses the Azure and AWS resources to find new ones to be created in the IPAM, old ones to be removed and existing ones to be updated as needed. During the synchronization process, network automation linked to the creation or destruction of a subnet object or an IP address is automatically triggered. This enables pushing of the information to other systems such as billing, accounting, security or auditing. It also avoids third-party systems performing their own discovery of resources inside multiple cloud environments and makes them use the IPAM as the central repository of information, the single source of network truth. Getting access to a cloud inventory requires credentials, so centralizing the usage of these credentials does not compromise security in any way.
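The synchronization pass described above amounts to a reconciliation loop: list what the cloud reports, diff it against the IPAM repository, and converge. The sketch below is illustrative only - the `Ipam` class, event names and data shapes are invented stand-ins, since a real implementation would call the provider SDKs (e.g. boto3 for AWS) and the IPAM vendor's own API.

```python
# Toy reconciliation pass for a Cloud IPAM Sync. All names here are
# hypothetical stand-ins, not any vendor's real API.

class Ipam:
    """In-memory repository standing in for a real IPAM product."""
    def __init__(self):
        self.records = {}   # {cidr: attributes}
        self.events = []    # automation hooks fired during sync

    def trigger(self, event, cidr):
        # A real IPAM would push these events to billing, security,
        # auditing or other downstream systems.
        self.events.append((event, cidr))

def sync(cloud_subnets, ipam):
    """Make the IPAM mirror one cloud tenant's subnet inventory."""
    for cidr, attrs in cloud_subnets.items():
        if cidr not in ipam.records:
            ipam.records[cidr] = attrs            # newly discovered subnet
            ipam.trigger("subnet_created", cidr)
        elif ipam.records[cidr] != attrs:
            ipam.records[cidr] = attrs            # attributes drifted: update
    for cidr in set(ipam.records) - set(cloud_subnets):
        del ipam.records[cidr]                    # gone from the cloud: remove
        ipam.trigger("subnet_deleted", cidr)

ipam = Ipam()
ipam.records = {"10.0.0.0/24": {"vpc": "a"}, "10.0.1.0/24": {"vpc": "a"}}
cloud = {"10.0.0.0/24": {"vpc": "a"}, "10.0.2.0/24": {"vpc": "b"}}
sync(cloud, ipam)
print(sorted(ipam.records))  # ['10.0.0.0/24', '10.0.2.0/24']
print(ipam.events)
```

Run on a schedule, or driven by cloud event notifications, a loop like this keeps the repository current without every downstream system needing its own cloud credentials.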
A cost control system requiring accurate knowledge of the amount of resources in a cloud is one example. Cloud providers offer pay-as-you-go billing, which makes spending difficult to control. Generally, around 40% of servers in IaaS environments are not used for production and are mostly utilized during working hours, so default pay-as-you-go billing for these resources is inefficient. Since the IPAM provides comprehensive visibility of all the resources currently running in various tenants, a business analytics dashboard can track the evolution of running resources. This can then be correlated with the overall billing system, resulting in significant cost savings over time.
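A back-of-the-envelope calculation shows why that visibility pays off. The figures below are assumptions for illustration (100 instances at $0.10/hour, the 40% non-production share quoted above, roughly 220 working hours a month), not measured data:

```python
# Estimate the saving from running non-production instances only during
# working hours. All inputs are illustrative assumptions.

def monthly_cost(instances, hourly_rate, hours=730):
    """Pay-as-you-go cost for instances left running 24/7."""
    return instances * hourly_rate * hours

def scheduled_cost(instances, hourly_rate, nonprod_share=0.4, busy_hours=220):
    """Cost when the non-production share only runs during working hours."""
    prod = instances * (1 - nonprod_share)
    nonprod = instances * nonprod_share
    return monthly_cost(prod, hourly_rate) + nonprod * hourly_rate * busy_hours

baseline = monthly_cost(100, 0.10)     # 100 instances at $0.10/hour, 24/7
optimized = scheduled_cost(100, 0.10)
savings = 1 - optimized / baseline
print(round(savings, 2))  # 0.28, i.e. roughly a quarter off the bill
```

The exact percentage depends on rates and schedules, but the point stands: without an accurate, continuously synchronized inventory, none of these levers can be pulled reliably.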
Extend discovery to multi-cloud, include apps and devices
A robust solution is especially beneficial for the agility, reliability and security of cloud infrastructure for internal cloud resource inventory and network automation. If the Cloud IPAM Sync enhances the discovery process beyond internal datacenter boundaries, the IPAM can be considered as the central and unique source of truth for any IP-related information, particularly if applications and devices are included.
It is not an overstatement to say that advanced Cloud IPAM Sync functionality will help businesses overcome their hybrid cloud challenges. The single-viewpoint visibility, unified management and control afforded by a central repository are key to helping businesses ride the wave of uncertainty and manage future challenges.
Jack in the Box Goes All-In on AWS – Business Wire
SEATTLE--(BUSINESS WIRE)--Today, Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), announced that Jack in the Box, Inc. (NASDAQ: JACK) is running its infrastructure on AWS, going all-in on the world's leading cloud to drive flexibility and resiliency across its organization and enable the company to offer new cloud-based experiences for the more than half a billion customers who visit its restaurants every year. Jack in the Box migrated from its on-premises data centers to AWS, improving the performance and reliability of its IT infrastructure, and positioning the company to better serve customers in the digital age.
Jack in the Box chose AWS for its proven expertise in supporting the restaurant and hospitality industry, its comprehensive set of cloud services, and its scalable infrastructure, which are enabling the company to improve operational efficiency throughout the business. For example, by moving off of Oracle and Microsoft SQL Server legacy databases to Amazon Relational Database Service (RDS) and Amazon Redshift for data warehousing, Jack in the Box was able to automate time-consuming IT administration tasks such as hardware provisioning, database setup, patching, and backups, as well as cut software and hosting costs. Jack in the Box and its franchise restaurant operators now use a common operational dashboard powered by AWS to analyze sales, inventory, food safety, and labor patterns, enabling them to focus on the daily performance of the business rather than the undifferentiated heavy lifting of backing up their servers and other tasks.
Jack in the Box is also leveraging the world's leading cloud to innovate enhanced digital ordering, dining, and customer service experiences for its guests. Going all-in on AWS means Jack in the Box can leverage AWS's broad portfolio of machine learning (ML) services to reveal more powerful insights into its customers' tastes and habits. This includes Contact Lens with Amazon Connect, a set of ML capabilities integrated into a cloud-based contact center service, to enable customer service to better understand customer conversation sentiment and trends. Jack in the Box will look to use these insights to more effectively train its customer service agents, replicate successful interactions, and identify product feedback that can be used to develop new menu offerings and promotions.
Moving forward, Jack in the Box is actively planning future innovations, including helping its restaurants more accurately predict customer traffic and optimize service time and food costs using Amazon SageMaker, AWS's service for building, training, and deploying ML models. In addition, in response to increased use of its mobile app and delivery service as customers altered their routines this year, Jack in the Box also plans to use Amazon Personalize, an AWS ML service for creating individualized recommendations, to present customers with tailored suggestions on new food and beverage options and add-ons.
"As one of the nation's first hamburger chains, we pride ourselves on being a leader in fast food innovation, offering customers creative new menu items and the ability to customize their meals. By using AWS's full portfolio of cloud services, we can continue to innovate new customer experiences while providing valuable information to our franchisees to help them operate more efficiently," said Drew Martin, Chief Information Officer, Jack in the Box. "AWS gives us the ability to be a more flexible, resilient, and data-driven organization, which is essential for our business to understand and adjust to the impacts of challenges such as COVID-19, flexibly scaling and contracting our resources to optimize how we operate."
"Jack in the Box has been delighting restaurant goers for nearly 70 years. Now, with AWS's proven infrastructure and deep portfolio of services powering their IT operations, they are able to expand their use of digital channels like online ordering and delivery apps to continue earning the loyalty of future generations," said Greg Pearson, Vice President, Worldwide Commercial Sales at Amazon Web Services, Inc. "By going all-in on AWS, Jack in the Box can spend most of their time innovating versus having to navigate multiple platforms, giving the company and its restaurant franchise owners the ability to understand their customers better and anticipate their needs, while also providing the scale and flexibility to quickly respond to changing business operating conditions."
About Amazon Web Services
For 14 years, Amazon Web Services has been the world's most comprehensive and broadly adopted cloud platform. AWS offers over 175 fully featured services for compute, storage, databases, networking, analytics, robotics, machine learning and artificial intelligence (AI), Internet of Things (IoT), mobile, security, hybrid, virtual and augmented reality (VR and AR), media, and application development, deployment, and management from 77 Availability Zones (AZs) within 24 geographic regions, with announced plans for nine more Availability Zones and three more AWS Regions in Indonesia, Japan, and Spain. Millions of customers - including the fastest-growing startups, largest enterprises, and leading government agencies - trust AWS to power their infrastructure, become more agile, and lower costs. To learn more about AWS, visit aws.amazon.com.
About Amazon
Amazon is guided by four principles: customer obsession rather than competitor focus, passion for invention, commitment to operational excellence, and long-term thinking. Customer reviews, 1-Click shopping, personalized recommendations, Prime, Fulfillment by Amazon, AWS, Kindle Direct Publishing, Kindle, Fire tablets, Fire TV, Amazon Echo, and Alexa are some of the products and services pioneered by Amazon. For more information, visit http://www.amazon.com/about and follow @AmazonNews.
About Jack in the Box
Jack in the Box Inc. (NASDAQ: JACK), based in San Diego, is a restaurant company that operates and franchises Jack in the Box restaurants, one of the nation's largest hamburger chains, with more than 2,220 restaurants in 21 states. Known as the pioneer of all-day breakfast, and the late night category, Jack in the Box prides itself on being the curly fry in a world of regular fries. For more information on Jack in the Box, including franchising opportunities, visit http://www.jackinthebox.com. If you have media inquiries, please reach out to media@jackinthebox.com.
Nutanix Clusters takes on-premises Nutanix to AWS – Blocks and Files
Nutanix is ready to announce Nutanix Clusters. This brings the on-premises Nutanix experience to AWS and opens another front in the company's battle with VMware.
Sources close to the company say Nutanix Clusters in AWS (NCA) has been in an early-access test phase for many months and is now robust and ready to move into general availability.
NCA runs on bare metal all-flash servers in AWS and uses AWS networking. Customers spin up servers using their AWS account and deploy Nutanix software on them. This process uses AWS CloudFormation, Amazon's facility to provision and model third-party applications in AWS. On-premises Nutanix licenses can be moved to AWS to instantiate NCA there.
VMware uses its ESX hypervisor as an overlay atop AWS networking and this can sap resources and become a performance bottleneck, according to Nutanix sources.
NCA supports four types of AWS instance, including Large CPU, Large Memory and Large Storage. The Nutanix Prism management console can be used to manage NCA.
NCA bursts on-premises Nutanix deployments to AWS to cope with spikes - for example, an immediate requirement to add 1,000 virtual desktops. It also has disaster recovery capabilities.
Customers can use NCA to migrate on-premises applications running on Nutanix to AWS. There is no need to re-engineer applications as the software runs the Nutanix environment transparently across the public cloud and on-premises worlds.
When no longer needed, a Nutanix cluster in AWS can be spun down with its data stored in S3, incurring S3 charges only, until spun up again; the spun-up NCA is then rehydrated from the S3 store. We understand this go-to-sleep facility will follow the main NCA release in a few months.
A blog by Nutanix CTO Binny Gill provides more background on NCA, and there is more information in an early-access solution brief.
How the FT prepared for a world without third-party cookies – The Drum
Permutive won the 'Best Sell Side Innovation' category at The Drum Digital Advertising Awards 2020 for its collaboration with the Financial Times. Here, the team behind the entry reveals the challenges faced and strategies used to deliver this successful project.
The challenge
The Financial Times is one of the world's leading news organisations, recognised internationally for its authority, integrity and accuracy. It topped a million subscribers in 2019 (some 75% of them digital), a year ahead of schedule, and has hefty ambitions moving forward. Operating a split revenue model, the FT does not rely on subscriber revenues alone: it also depends on advertising, with branded content playing a larger role in delivering that advertising revenue.
However, with the introduction of privacy-focused laws such as GDPR and browser changes, including Apple Safari's and Mozilla Firefox's anti-tracking measures, the FT knew it needed to keep ahead of the changes without losing the ability to target its audiences.
Knowing that Google could also clamp down on cookies (it has since announced that third-party cookies will be phased out by 2022), the FT needed to change the way it operated - and fast. Chrome accounted for just under half of the FT's impressions in 2018-19, so if it had continued to rely on third-party cookies, its revenue would have taken a big hit. Google's own figures suggest that without third-party cookies, publisher revenues would drop by an average of 52% on its platform.
FT.com needed a real-time, first-party cookie solution to unlock its valuable audiences and provide clients with the scale they demanded.
The strategy
The FT turned to Permutive to combat the following challenges:
Privacy and regulation - third-party data is becoming increasingly redundant as laws and browser changes take effect. The FT was also looking for more efficient methods of responding to GDPR requests.
Browser changes - Apple's ITP was the trigger for the FT team to prepare for a cookie-less world. For every new ITP release, Permutive estimates that publishers experience up to a 60% drop in programmatic revenue on the browser.
Reporting was time-consuming - the reporting and analysis on its legacy DMP was too manual, making it time-consuming and less effective at informing decision-making.
Workarounds were not working - the FT found that most vendor alternatives were not publisher-driven; they were focused on finding workarounds to keep third-party cookies functioning, and these were quickly being eliminated by browser updates.
Audience segmentation - The FT knew that its clients wanted to know more about their audiences: from the segment they were targeting to how a campaign performed on site, as well as what learnings they could take into the next campaign.
The solution
All segments from the FT's existing DMP were recreated in Permutive, and since Permutive does not rely on third-party cookies, the FT can collect, analyse and activate its entire audience across all devices and browsers. Additionally, Permutive is built on edge computing, unlike traditional DMPs built in the cloud. This means that data is processed on the user's device and isn't sent back-and-forth to cloud servers.
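As a rough illustration of that on-device model (the event shapes, segment names and thresholds below are invented for the sketch and are not Permutive's actual API), segment membership can be evaluated locally against the user's own event log, so only segment ids, never the raw behavioural events, need to be exposed for targeting:

```python
# Evaluate audience segments on the device. Only the matching segment ids
# would be shared for ad targeting; raw events stay local.

def evaluate_segments(events, rules):
    """Return the ids of segments whose predicate matches the event log."""
    return sorted(seg for seg, predicate in rules.items() if predicate(events))

# Hypothetical segment definitions (thresholds invented for illustration).
rules = {
    "engaged_reader": lambda ev: sum(e.get("seconds", 0) for e in ev
                                     if e["type"] == "engagement") >= 120,
    "markets_fan": lambda ev: sum(1 for e in ev
                                  if e.get("section") == "markets") >= 3,
}

# A local, on-device event log.
events = [
    {"type": "engagement", "seconds": 90},
    {"type": "engagement", "seconds": 45},
    {"type": "pageview", "section": "markets"},
]
print(evaluate_segments(events, rules))  # ['engaged_reader']
```

The design choice matters for privacy: because the rules travel to the data rather than the data to the cloud, nothing personally identifiable has to leave the device.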
All of the FT's segments were historically built using frequency and recency, but with Permutive it is now starting to build out segments based on on-site behaviour such as total engagement time and, potentially, scroll depth. This will help improve CTR within campaigns and is a focus for 2020. That additional layer is also helping its sales team to build a stronger narrative to take to market.
The FT can now target users in milliseconds to serve relevant audience-targeted advertising and deliver more information to clients on its users and campaign performance, while also being more secure: data doesn't leave the device, mitigating data leaks and supporting GDPR compliance.
The results
The project increased scale, revenue and privacy compliance for the FT. Adopting Permutive helped from a privacy perspective, as the previous DMP utilised a network-wide domain, meaning audience behaviours could be collated across any publisher. Utilising the FT.com domain removed any risk of data being pooled.
Permutive allows the FT to target users based on engagement, and to learn more about its readers and how they interact specifically with its marketing. It has added another weapon to the FT's commercial arsenal. Whereas before it had one or two different data sources, it can now unlock a whole other layer in terms of proving who its audience is, demonstrating interests relevant to the client beyond demographic information.
The FT makes the most of the analytics feature within Permutive, having previously been reliant on manually inputting data into Excel sheets to draw information on audience segments. It can now easily segment users at a granular level using all of the information it is collecting about on-site behaviour. For segmentation and analysis, it can also look back at all historical data, with none of the previous limits. This builds a much more insightful picture of all FT users, making it easy to package audiences for advertisers.
"We looked at other vendors but Permutive stood out because it was a publisher-focused, real-time DMP. It was the right time to be talking about a first-party data system, and once we could see how the technology works on device, rather than in the cloud, it made a lot of sense. We're seeing improvements in scale of inventory as well as revenue, and campaign performance is seeing an uplift, too. It's all helping our sales teams build a much stronger narrative to go to market with." - Paris Luc, digital targeting manager, the Financial Times
This project was highly commended at The Drum Digital Advertising Awards 2020.
Samsung Electronics: Defining the Boundaries of Communications – marketscreener.com
Communication is about sharing information with others.
The evolution of communications technology has enabled us to be more connected than ever before, meaning that information can be shared anytime and anywhere.
Mobile communication is a business with a well-established global ecosystem, from equipment manufacturers to telecommunications operators, and a common rule set is essential to keeping that ecosystem moving forward collaboratively. This is where the process of standardization comes in, setting internationally agreed-upon standards to give users access to better products and services at lower prices. A representative example demonstrating the benefits of international standardization is the global roaming service, which allows users travelling to foreign countries to use their mobile devices as they are.
Standardization has been one of the main driving forces behind the growth of the mobile communication industry, with a new generation introduced roughly once every decade. 'Large-scale investments into mobile communication have been triggered when each new generation of communications is commercialized,' explained Dr. Han. 'When certain countries or companies run their businesses with proprietary solutions, the risk of failure increases.' This means that the chance of success increases only when the stakeholders of the mobile communication ecosystem come together to define the most relevant technologies and discuss aspects like implementation early enough. 'Determining communications standards and developing products following these standards is an equitable process,' noted Dr. Han. 'These standards are crucial.'
Standardization is two-fold: the de jure standards obligated by regulators and the de facto standards established by the global communications industry which, while not compulsory, specify unified ways of operation for stakeholders around the world to follow. The Standards Research team of Samsung's Advanced Communications Research Center oversees both standards.
'For example, in order to utilize the extremely high frequency band (mmWave) for 5G, de jure standardization is a prerequisite for the commercialization of any device using the band, which includes assigning a set of frequency bands to mobile communication, setting regulated conditions such as maximum transmission power and out-of-band emission, and ensuring its safety for the human body and existing devices,' explained Dr. Han. 'We are also simultaneously developing protocol technologies and working on de facto standardization to include these technologies into the standards by participating in standards developing organizations such as 3GPP (3rd Generation Partnership Project) and IEEE (Institute of Electrical and Electronics Engineers).' Dr. Han emphasized that both de jure and de facto standards are equally important.
Working as a Communications Standard Expert
Frequency bands are a limited resource. It is inevitable that different parties will clash over acquiring such an in-demand resource, which is why each frequency band is already allocated to a specific purpose, e.g. fixed communications, mobile communication, broadcasting, satellite, or other uses. The extremely high frequency band adopted for 5G was unexplored territory from the perspective of mobile communication, and when Samsung initially proposed it, there was pushback.
Standards experts are supposed to take the initiative of reserving such new spectrums for the mobile communication industry. 'By stressing mobile communication's contribution to the economy, we managed to persuade the governments of each country, and attracted more supporters by showing them the feasibility of applying this extremely high frequency band to mobile communication,' recalled Dr. Han. 'We actively presented many details to justify our claim, including the simulation results of a coexistence study. As a result, we were able to have this extremely high frequency band assigned to 5G.'
'There is no almighty judge when it comes to fairly determining which technology among many candidates should be selected as part of the standard. Moreover, any technology has its own pros and cons,' said Dr. Han. 'There is a decision-making process inherent to standardization. Proposals are first made by companies, intensive technical debate on each proposal then follows, and participants finally build a consensus to reach a conclusion. We have to avoid sticking to our own interests. Instead, we try to communicate with other stakeholders to find the best way forward based on an understanding of the industry as a whole. When we take care of the ecosystem, proposals that we develop to make it healthy and sustainable will be supported by the majority as a result.'

Similar to the role of a diplomat, standardization experts participate in global standardization conferences, where they represent their company or their country. They are expected to be the best in their own field. 'As we are contending at the forefront of these international discussions, technical competitiveness is the key requirement for Samsung delegates,' explained Dr. Han. 'Therefore, in our projects, whoever is most competitive in a certain area is designated as the champion of that area, regardless of which team he or she belongs to.'
Standardization, the Next Phase of 5G
4G is a communications technology designed to enable the wireless broadband service for smartphones. In particular, 4G as a universal communications platform aggressively adopted the Internet protocol that was popularly used in past wired packet communications. Therefore, many Internet-based services could easily migrate to cellular systems. 5G, then, is designed to expand its territory from the broadband service for smartphone users to vertical markets including the smart factory, automobile, healthcare, private network, smart city, and more. 4G as a universal solution led to a huge growth of the communications market. On the other hand, 5G aims to create new markets based on its new design principle of customizable networks to fulfill the specific requirements of a particular industry sector.
To realize the innovations that 5G has promised, Dr. Han and his team have been working on Rel-16, the second version of 5G. 'Rel-15, the first version of 5G, laid a new framework for the technology and focused on how to provide differentiated experiences to conventional customers, i.e. smartphone users,' noted Dr. Han. 'We joined the global collaboration to develop Rel-16 in order to realize the 5G vision. Rel-16 introduces and enhances 5G's features for vertical markets. For example, V2X (vehicle-to-everything) is for connected cars, industrial IoT communications is for smart factories, and the data analytics function has been improved for network AI.'
Even though 5G has been commercialized, the standardization of 5G for further enhancements will never stop. Until the launch of 6G, the 5G standard will continuously evolve in order to improve and expand 5G. 'As soon as we concluded the development of 5G's second version, we immediately began work on the third version, Rel-17,' commented Dr. Han. 'We have discovered some areas to improve commercial 5G networks with, including coverage expansion and NR-MIMO (Multiple Input Multiple Output). These will be amended and enhanced in the upcoming versions. Furthermore, we will continue to discover new features to add in order to enable new 5G applications. Innovations we are looking at include media delivery for AR glasses-type devices and edge computing enablers for low latency services from cloud servers close to users.'
Standardization of Edge Computing, Further Enhancement for 5G Services
Samsung is constantly pushing the boundaries of 5G in order to bring its unique experiences to users. One key characteristic of 5G is its ultra-low latency, brought about by its nine-tenths latency reduction in the radio access link between terminal and base station as compared to the previous generation. In order for users to experience the quality of ultra-low latency services, the end-to-end latency between the user terminal and the cloud server should be reduced. Samsung believes that edge computing will solve the rest of this puzzle, this being latency reduction in the backbone network, by placing the server closer to users. Thanks to 5G and edge computing, users will finally be able to enjoy 5G's signature service on their devices.
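A toy latency budget makes the argument concrete. The round-trip figures below are assumed round numbers for illustration, not measured values: cutting the radio link by nine-tenths helps little while the server stays in a distant cloud region, because the backbone leg dominates until edge computing shortens it.

```python
# End-to-end latency = radio access leg + backbone leg (illustrative ms values).

def end_to_end(radio_ms, backbone_ms):
    return radio_ms + backbone_ms

lte = end_to_end(radio_ms=10, backbone_ms=40)       # 4G to a remote cloud
nr_remote = end_to_end(radio_ms=1, backbone_ms=40)  # 5G, server still remote
nr_edge = end_to_end(radio_ms=1, backbone_ms=4)     # 5G plus an edge server
print(lte, nr_remote, nr_edge)  # 50 41 5
```

In this sketch the 5G radio upgrade alone trims end-to-end latency from 50 ms to 41 ms; only moving the server to the edge gets it down to the few milliseconds that ultra-low latency services need.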
'The link between a device and its server was out of 3GPP's scope,' said Dr. Han. 'But it is also hard for other standards organizations who are not experts in 5G to develop the standard for edge computing without a complete understanding of 5G systems.' Due to this difficulty, attempts were made to develop edge computing-enabled communication using proprietary solutions - which would lead to serious market fragmentation. 'Samsung initiated discussions on edge computing inside 3GPP and persuaded other participating companies. We are now leading the standardization effort for enabling edge computing in 5G systems as one of the key items of Rel-17.'
In 2009, Samsung began the early stages of 5G research with the question of 'how can we improve cellular networks to be 10 times better than 4G LTE?' Samsung will continue to develop further enhanced technologies for the future of 5G. 'Samsung plays various key roles in the influential standardization organization for mobile communications and leads those standards and related technologies,' explained Dr. Han. 'Based on our perseverance for over 10 years in this field, we will overcome whatever obstacles we encounter and will make 5G a big success.'
Making a Better World - Through Technology
Dr. Han began working in this field because, when he was a student, he was extremely curious about who made the standard specifications - the ground rules that were akin to a communications bible. Today, he is leading the team shaping the future of communications with standards. What resolution has he set himself?
'When we worked on LTE standards, we did not even expect that the term 'LTE', back then only used by select standards engineers, would become a common and popular term,' noted Dr. Han. 'This experience reminded me that the technologies we create can change the world and the daily lives of people. We are also aware of the high expectations of the 5G that we have developed. I firmly believe that our work will benefit the world.'
Dr. Han is also working on promoting Samsung's 6G vision to inspire people in this field. 'In the future, the main customers in the communications market won't just be human, but will include robots and other machines, too,' explained Dr. Han. 'People will start to enjoy hyper-connected experiences and be able to explore reality in a virtual world without temporal or spatial constraints. 6G will present fundamental technologies for such innovations. We will begin communicating with stakeholders as per Samsung's 6G White Paper, published on July 14. Our 5G experience and the insights captured in our 6G vision will help us prepare for the long journey toward another success story with 6G.'
'Moreover, the sustainable growth of society and the communications industry will be key considerations for shaping 6G.'
This Week in Storage, featuring Qumulo, Actifio and more – Blocks and Files
This week, AWS has made Qumulo filer software available in the AWS government cloud; Actifio backs up SAP HANA to object storage in GCP; and LucidLink is doing cloud collab stuff with Adobe Premiere Pro.
Scalable file system supplier Qumulo has announced its availability in the AWS GovCloud (US) through the AWS Marketplace.
Qumulo says government organisations can now integrate their file data with legacy applications in private cloud and cloud-native applications in AWS GovCloud (US) with a single file data platform.
The company is working with Corsec Security Inc. to gain various US government certifications for its software, including the upcoming FIPS 140-2 and Common Criteria EAL2+ certifications of its platform. It said it aims to make Qumulo the strategic choice for all types of Controlled Unclassified Information (CUI) and unclassified file data.
NetApp, a Qumulo competitor, this week announced that Azure NetApp Files is available in the Azure government cloud.
Copy data manager Actifio is waving a tech validation report from ESG that says it reduced backup and disaster recovery (DR) infrastructure costs by 86 per cent when protecting SAP HANA workloads with Google Cloud Platform (GCP) object storage. The comparison is with legacy backup approaches using high-performance block storage.
ESG found the same high levels of performance from a DR copy running off Google Nearline object storage as from production instances running on Google Persistent Disk block storage.
ESG senior validation analyst Tony Palmer said: "Cloud object storage is typically 10x less expensive than cloud SSD/flash block storage. Actifio's ability to recover an SAP HANA database in just minutes from cloud object storage, while delivering the I/O performance of SSD/flash block storage, is unique in the industry and reduces cloud infrastructure costs by more than 80 per cent for enterprises."
You can download the ESG Actifio SAP HANA Technology Review.
LucidLink, which supplies accelerated cloud-native file access software, is partnering with Adobe so that Premiere Pro users can edit projects directly from the cloud.
Generally, users of Adobe's Premiere Pro video editing software edit local files because access is fast. However, team working, and remote team working in particular, requires multi-user access to remote files. LucidLink's Filespaces can provide teams with on-demand access to media assets in the cloud, accessed as if they were on a local drive.
Sue Skidmore, head of partner relations for Adobe Video, said: "With so many creative teams working remotely, the ability to edit Premiere Pro projects directly from the cloud has become even more important. We don't want location to hold back creativity. Now Premiere users can collaborate no matter where they are."
Filespaces provides a centralised repository with unlimited access to media assets from any point in existing workflows. The pandemic has encouraged remote working. Peter Thompson, LucidLink co-founder and CEO, provided a second canned quote: "Our customers report they can implement workflows previously considered impossible. We are providing the missing link in cloud workflows with streaming files."
Actifio has announced technical validation and support for Oracle Cloud VMware Solution (OCVS), Oracle's new dedicated, cloud-native VMware-based environment. OCVS enables enterprises to move their production VMware workloads to Oracle Cloud Infrastructure, with an identical experience in the cloud as in on-premises data centres. It integrates with Oracle's second-generation cloud infrastructure. OCVS is available now in all public regions and in customer Dedicated Region cloud instances.
Taiwan-based Chenbro has announced the RB23712, a Level 6, 2U rackmount server barebone (no CPUs or drives fitted) with 12 drive bays, designed for storage-focused applications in the data centre and HPC enterprise. It pre-integrates an Intel Server Board S2600WFTR with support for up to two 2nd Generation Intel Xeon Scalable processors. The RB23712 offers Apache Pass, IPMI 2.0 and Redfish compliance, and includes Intel RSTe/Intel VROC options.
Microchip Technology has introduced the latest member of the Flashtec family, the Flashtec NVMe 3108 PCIe Gen 4 enterprise SSD controller with 8 channels. It complements the 16-channel Flashtec NVMe 3016 and provides a full suite of PCIe Gen 4 NVMe SSD functions. The 3108 is intended for use in M.2 and SNIA Enterprise and Data Center SSD Form Factor (EDSFF) E1.S drives.
Nutanix says it has passed 2,500 customers for Nutanix Files. Files is part of a Nutanix suite for structured and unstructured data management, which includes Nutanix Objects, delivering S3-compatible object storage, and Nutanix Volumes for scale-out block storage.
Penguin Computing has become a High Performance Computing (HPC) sector reseller and solution provider of Pavilion Hyperparallel Flash Arrays (HFA).
Quantum has announced its ActiveScale S3-compatible object store software has been verified as a Veeam Ready Object Solution.
Synology has launched new all-flash storage and a line of enterprise SSDs. The FS3600 storage system is the newest member of Synology's expanding FlashStation family of network-attached storage (NAS) servers. Synology has also announced the release of SAT5200 SATA SSDs and SNV3400 and SNV3500 NVMe SSDs.
The FS3600 features a 12-core Xeon, up to 72 drives, and 56GbitE support. The new SSDs fit its enclosures and carry 5-year warranties. They integrate with Synology DiskStation Manager (DSM) for lifespan prediction based on actual workloads.
Data replicator and migrator WANdisco said it is the first independent software vendor to achieve AWS Competency Status in data migration.
Zerto is reprinting a short Gartner report, "What I&O leaders need to know about disaster recovery to the cloud". The report assumes that by 2023, at least 50 per cent of commodity-server workloads still hosted in the data centre will use public cloud for disaster recovery. It's an eight-minute read and you can get it with minimal registration.
10 billion records sit in unsecured databases – China leads the pack – SecurityBrief New Zealand
China, the United States, India, Germany, and Singapore are the top five countries with the most unsecured databases in the world, or at least that's according to new research from NordVPN.
The security firm partnered with a white hat hacker to scan Elasticsearch and MongoDB servers for unsecured databases over the space of one year.
The hacker uncovered a total of 9,517 unsecured databases, collectively containing more than 10 billion entries (10,463,315,645 in total) with data such as emails, passwords, phone numbers, and other sensitive information.
China topped the list with 3,794 exposed databases, collectively containing more than 2.6 billion (2,629,383,174) detected entries.
The United States wasn't too far behind, with 2,703 exposed databases and 2.4 billion (2,397,583,255) entries.
India had 520 exposed databases with 4.9 million entries; Germany had 361 exposed databases with 248 million entries; Singapore had 355 exposed databases with 2.3 million entries.
Rounding out the top 10 were France, South Africa, the Netherlands, Russia, and the United Kingdom.
Other countries included South Korea, Ireland, Vietnam, Hong Kong, Brazil, Japan, Canada, Iran, Australia, and Taiwan.
NordVPN warns that although some of the exposed entries could be junk and only used for the purposes of testing, it could be hugely damaging if sensitive information were exposed.
NordVPN points to recent data leaks including a case where 540 million Facebook records were exposed on Amazon cloud servers.
Furthermore, search engines such as Shodan and Censys scan the internet constantly, enabling people to gain access to open databases. NordPass security expert Chad Hammond says anyone could scan the internet in as little as 40 minutes.
Security threats, such as automated Meow attacks that destroy data without reason or ransom, also place unsecured databases at more risk.
Hammond says, "Every company, entity, or developer should make sure they never leave any database exposed, as this is obviously a huge threat to user data."
He adds that database protection should include data encryption at rest and in motion, identity management, and vulnerability management.
All should be encrypted using trusted and robust algorithms instead of custom or random methods. It's also important to select appropriate key lengths to protect your system from attacks.
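As an illustration of "trusted algorithms and appropriate key lengths", the sketch below uses only Python standard-library primitives: PBKDF2-HMAC-SHA256 to derive a 256-bit key and HMAC-SHA256 to detect tampering. The helper names and the sample record are illustrative, not part of any particular database product:

```python
import hashlib
import hmac
import secrets

def derive_key(password: str, salt: bytes, length: int = 32) -> bytes:
    """Derive a fixed-length key with PBKDF2-HMAC-SHA256, a vetted KDF."""
    # The iteration count is a tunable cost parameter; raise it as hardware allows.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000, dklen=length)

def sign(key: bytes, message: bytes) -> bytes:
    """Authenticate a record with HMAC-SHA256 rather than a custom MAC."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(key, message), tag)

salt = secrets.token_bytes(16)               # a unique random salt per secret
key = derive_key("a long passphrase", salt)  # 32 bytes = a 256-bit key
tag = sign(key, b"db-record-123")
assert verify(key, b"db-record-123", tag)        # intact record verifies
assert not verify(key, b"tampered-record", tag)  # tampering is detected
```

The point is the shape of the approach, not these exact calls: standard, well-reviewed algorithms, random salts, and keys of a deliberate length rather than ad hoc methods.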
Identity management is another important step and should be used to ensure that only the relevant people in an enterprise have access to technological resources.
"Finally, every company should have a local security team responsible for vulnerability management and able to detect any vulnerabilities early on," he concludes.
Finding the Right and Secured Video Platforms for your Business – Security Boulevard
Life is not easy for internet-based OTT service providers, whether they are pay-TV cable operators or new internet players. Users have become accustomed to having all of their entertainment sources on all their devices all the time, from e-books to digital music, with no compromise on video. Meanwhile, expectations from entertainment studios have risen: they want their content protected from any illegal use.
The technological complexity of building, maintaining and streamlining these multiscreen OTT services is not decreasing. OTT players require a variety of skills, including video streaming, data protection, application support and other technical infrastructure. However, no single platform offering all of these competencies has emerged that OTT operators can depend on to build services that are accessible, interoperable and automated.
Digital Rights Management (DRM) is a digital authorisation system that allows content administrators to control how, and by whom, content is consumed.
DRM is often confused with encryption. Encryption is the method of scrambling digital information, while DRM is the comprehensive process of managing content access. It includes the delegation of locking and unlocking keys, and backend authorisation systems with features such as policy enforcement and control of downloaded playback.
Content owners require commercial DRMs to safeguard their content. To gain access to content from content authors, broadcasters, OTT operators or network distributors, there is a requirement to use one of a few chosen DRM systems.
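The delegation of keys by a backend authorisation system, described above, can be sketched as a toy licence check. Everything here, from names to structures, is hypothetical; real DRMs such as Widevine or FairPlay are far more involved:

```python
import secrets
from typing import Optional

# Hypothetical in-memory "licence server" state: content keys and usage policies.
CONTENT_KEYS = {"movie-42": secrets.token_bytes(16)}
POLICIES = {"movie-42": {"allow_download": False, "min_tier": "premium"}}
USER_TIERS = {"alice": "premium", "bob": "free"}

def request_license(user: str, content_id: str) -> Optional[bytes]:
    """Hand out the content key only if the user's entitlement satisfies policy."""
    policy = POLICIES.get(content_id)
    if policy is None:
        return None
    if policy["min_tier"] == "premium" and USER_TIERS.get(user) != "premium":
        return None  # policy check failed: no key, so no playback
    return CONTENT_KEYS[content_id]

assert request_license("alice", "movie-42") is not None  # entitled user gets the key
assert request_license("bob", "movie-42") is None        # free tier is refused
```

The design point is that the player never ships with the key: it must ask the licence server, which evaluates policy before releasing anything.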
Hypertext Transfer Protocol Secure (HTTPS) is a method of securing live streaming and video communication over the internet. Netscape initially developed it to secure online traffic using Secure Sockets Layer (SSL); Transport Layer Security (TLS) support has since been added. HTTP is not intrinsically tied to video, but it has become customary to deliver video streaming over HTTP. Now we shall see how HTTPS works in OTT.
Recently, HTTPS has become the norm for streaming. Major video streaming players such as Facebook and Netflix require HTTPS for streaming videos on their platforms. When online traffic is sent in the clear, that is, streamed over unsecured HTTP, the metadata for the video streaming session is at risk: anyone can copy data about the browsing session, such as the video title or user ID details. At a higher level, anyone can record and analyse Netflix traffic to learn which content titles are being streamed most, and by whom.
With HTTPS, the transactions and metadata exchanged between the OTT streaming platform and the user are safeguarded by establishing a secure channel between the two. Hence, HTTPS ensures the confidentiality of users and their video streaming history.
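On the client side, the secure channel comes almost for free. Python's standard `ssl` module, for example, defaults to the settings a streaming client wants, as this short sketch shows:

```python
import ssl

# create_default_context() enables certificate chain validation, hostname
# checking, and (on modern Pythons) TLS 1.2+ only.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED  # reject unverifiable servers
assert ctx.check_hostname                    # name on the cert must match
# Explicitly refuse legacy protocol versions that downgrade attacks rely on.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# An HTTPS fetch would then wrap a TCP socket with this context, e.g.:
#   with socket.create_connection(("example.com", 443)) as sock:
#       with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
#           ...
```

Any metadata sent inside that wrapped socket, titles, user IDs, session details, is opaque to an observer on the network.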
AES (Advanced Encryption Standard) helps protect video content both at rest and in transit. It is a symmetric block cipher that can be implemented efficiently in software or hardware to encrypt large volumes of sensitive content. AES is the successor to DES (Data Encryption Standard), which IBM developed in the early 1970s.
Content protection with AES is similar to that described for HTTPS: AES encrypts the content so that the user needs special keys, requested over HTTPS, to access it.
In a nutshell, AES encrypts the video stream in such a way that it becomes nearly impossible for fraudsters to steal confidential data from your account; even if they could capture the video sessions, they still could not watch the videos.
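In HTTP streaming this pattern is visible in the manifest itself. An HLS playlist, for instance, declares AES-128 encryption and points the player at a key URI, which is typically served over HTTPS behind an authentication check (the URI and segment names below are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-KEY:METHOD=AES-128,URI="https://example.com/keys/session42.key",IV=0x00000000000000000000000000000001
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXT-X-ENDLIST
```

A client that cannot fetch the key can still download the `.ts` segments, but they are AES-encrypted and unwatchable, which is exactly the property described above.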
Essentially, there are three levels of video access.
In general, any user who has access to a streaming network can view both freely available and membership videos, depending on the type of access granted to them.
There are exceptions: sometimes a user can reach a protected video library but cannot watch the videos, because they have not been authorised to access the protected content. To gain that access, users need a special key that is sent to them by email after placing a request (for example, to upgrade their membership), or by mail in some cases.
To obtain this special access, users must be verified and validated with ID proofs and payment card details to ensure complete authenticity. These details are stored on servers against each user ID, and a full activity log is maintained with a clear record of each user's access level. Every time a member logs in, a complete security check cross-verifies these details.
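The server-side check and activity log described above can be sketched as follows. All the structures and names are hypothetical, and the "special key" is compared via a salted hash so raw keys are never stored:

```python
import hashlib
import time

# Hypothetical server-side store: each user's access level plus a salted
# hash of their special key (raw keys are never stored).
USERS = {
    "user-1": {"level": "protected", "salt": b"salt1",
               "key_hash": hashlib.sha256(b"salt1" + b"key-abc").hexdigest()},
    "user-2": {"level": "member", "salt": None, "key_hash": None},
}
ACTIVITY_LOG = []

def check_access(user_id, video_level, presented_key=None):
    """Grant access if the account's level covers the video; log every attempt."""
    record = USERS.get(user_id)
    granted = False
    if record is not None:
        if video_level != "protected":
            granted = True  # membership videos need no special key
        elif record["level"] == "protected" and presented_key is not None:
            digest = hashlib.sha256(record["salt"] + presented_key).hexdigest()
            granted = digest == record["key_hash"]
    ACTIVITY_LOG.append({"user": user_id, "video_level": video_level,
                         "granted": granted, "ts": time.time()})
    return granted

assert check_access("user-1", "protected", b"key-abc")      # correct special key
assert not check_access("user-2", "protected", b"key-abc")  # account level too low
assert len(ACTIVITY_LOG) == 2                               # every attempt logged
```

Logging denials as well as grants is what makes the activity log useful for the cross-verification step mentioned above.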
With the advent of HTML5 video, the player can be written in HTML5/JavaScript and run directly in the browser rather than as a separate application, as long as the browser can connect to one or more DRMs. With HTML5, there is no longer a need to depend on a platform's built-in player code or on third-party player code that operates independently of the browser.
A few videos have limited viewing rights across the network. With IP-based location restrictions you can prevent your video content from being watched by users outside the permitted geographies, protecting it from global video piracy. Geographic restriction stops your video sessions from being downloaded and watched from unauthorised locations, and if that does happen, you have the option to blacklist the whole location to cut off access. Blacklisting an entire location is a blunt instrument, but it does provide a second layer of protection.
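IP-based geographic restriction reduces to checking the client address against an allow-list of ranges for permitted regions; Python's standard `ipaddress` module makes the range check trivial. The CIDR blocks below are illustrative documentation ranges, and a real deployment would derive them from a GeoIP database:

```python
import ipaddress

# Hypothetical allow-list: CIDR blocks mapped to regions cleared for playback.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # example region A
    ipaddress.ip_network("198.51.100.0/24"),  # example region B
]
# Blacklisted sub-range: the "cut off a whole location" option described above.
BLACKLIST = [ipaddress.ip_network("198.51.100.128/25")]

def may_stream(client_ip: str) -> bool:
    """Allow playback only from whitelisted ranges, honouring blacklisted sub-ranges."""
    addr = ipaddress.ip_address(client_ip)
    if any(addr in net for net in BLACKLIST):
        return False
    return any(addr in net for net in ALLOWED_NETWORKS)

assert may_stream("203.0.113.7")         # inside an allowed block
assert not may_stream("192.0.2.1")       # outside every allowed block
assert not may_stream("198.51.100.200")  # allowed block, but blacklisted sub-range
```

Blacklist entries are checked first, which is what lets an operator revoke one location without rewriting the allow-list.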
Every video uploaded through a live streaming service is stored in a data centre administered by a Content Delivery Network (CDN), a decentralised network of cloud servers that uses software-based procedures to stream videos globally. A CDN minimises the chances of shaky video, buffering issues and content delay, and protects your video streaming from online attacks such as DDoS.
How to Create an Infrastructure for a Remote-Ready School – EdTech Magazine: Focus on Higher Education
When stay-at-home orders and physical distancing were implemented earlier this year, many schools scrambled to transition to remote learning. But not the Academy of Our Lady of Peace, where I serve as technology director.
The academy, the oldest high school in San Diego and the only all-girls school south of Los Angeles, already had technology in place that enabled us to pivot seamlessly to online instruction. We had zero downtime, and our 750 students didn't miss a single day of learning.
One reason for our success was that we were already fully in a cloud environment. We began the transition to cloud-based solutions a few years ago when our Microsoft Exchange Server reached its end-of-life status. Instead of automatically investing in a new onsite server, we moved forward with G Suite for Education because of its intuitive interface and robust back-end administrative capabilities.
How cloud is turning to be an effective tool for healthcare industry during Covid-19 – Express Computer
By Khushboo Jain
Healthcare in the digital age generates a tremendous amount of data on a daily basis. Patients' medical and financial details, as well as research data, are just some of what is produced, and maintaining a fast and secure database is of utmost importance.
With the coronavirus outbreak, hospitals and clinics are being overwhelmed with patients. The amount of data that needs to be generated or shared and the speed at which it needs to occur puts a lot of pressure on healthcare professionals. Luckily for them, cloud computing could provide a quick, secure, and cost-effective solution.
Cloud computing comes with a unique set of benefits that can greatly benefit the healthcare sector.
Management of servers
The advantage of cloud-based systems for healthcare is that managing data is not the job of the healthcare provider. With talented IT professionals watching over and managing the system, healthcare providers can focus on other important facets of care.
Cost benefits
With cloud computing, it is easier to oversee the services you pay for and make cost-effective decisions. By building a custom plan to fit your needs, you can negotiate a deal that is far more cost-effective than setting up your own systems.
Designed to manage a tremendous amount of data
As stated earlier, healthcare and its related sectors generate a lot of data. Medical images such as scans, for example, are extremely detailed and high-resolution, consuming a lot of storage. Much of this data must be kept, securely, for the patient's entire lifetime. Physical storage is inconvenient, and cloud computing provides an easier alternative.
Fast speeds
With patient numbers increasing, speed is of utmost importance. Access to fast cloud servers makes it easy to upload, share, and recover data quickly, and to make changes faster. The exchange of data and communication between healthcare workers, hospitals, research centres, and funding services such as medical crowdfunding creates a better healthcare environment. Time is of the essence in healthcare, and with cloud we can be far more time-efficient.
Security and protection
Cloud computing has come a long way in addressing security concerns. The use of private and hybrid cloud systems ensures that a patient's medical and financial details remain secure. For example, if a hospital has a patient who needs to raise funds on a crowdfunding platform, data can be exchanged securely between the platform and the hospital using cloud systems. Moreover, remote servers are better protected from on-location hazards and reduce hassle during data recovery.
The opportunities that cloud computing gives to healthcare systems:
Scalability
The needs of a healthcare service provider change over time, and scaling cloud services to match is easy. Cloud allows you to scale up or down quickly, meeting current needs, preventing unnecessary expenditure, and allowing for future growth.
Ability to update
Technology is in a constant state of change and innovation. As systems upgrade, data will need to be changed or updated, and with cloud those updates are much easier and quicker. A cloud-based system enables you to update your data, applications, and systems as quickly as possible.
Allowing easier collaborations
In the digital age, sharing resources is important for creating better opportunities for patients. For example, collaborating with other healthcare providers can improve services, while collaborating with crowdfunding and other alternative funding options helps patients afford them. Collaborations like these create a better healthcare system for everyone.
Using cloud data in telemedical practices
During this pandemic, doctors and patients alike risk contracting the virus in hospitals. Telemedical practices let healthcare workers continue to provide safe care remotely. These modern medical systems need to transfer patient data back and forth at high speed, something for which cloud can easily be used while also maintaining doctor-patient privacy. By bringing cloud computing into telemedical systems, we can have a system that is safe both physically and digitally.
More resources to focus on medical needs
From the advantages above, we can conclude that cloud-based systems drastically reduce the resources healthcare providers must devote to managing data. They save time, money, and other important resources, freeing healthcare service providers to concentrate on delivering better services, which should be their primary focus.
The early adopters of cloud services have been reaping its benefits for some time now. This has only proved that cloud computing is not only viable but essential to healthcare, and needs to be adopted now more than ever before.
(The author is Co-Founder and COO, ImpactGuru.com)
If you have an interesting article / experience / case study to share, please get in touch with us at [emailprotected]