Category Archives: Cloud Servers

The buzz about cloud-based document management systems and why it is likely to become mainstream – YourStory

In 2001, IDC reported that workers creating, managing or editing documents were spending up to 2.5 hours a day on average searching for what they needed. By 2012, IDC's Information Worker Survey reported that workers were spending about five hours per week searching for documents. The reduction in the time to find information can be attributed to technology solutions in document management. Despite these advances, documents continue to be stored and managed electronically in an unstructured way, making accessibility and security a key challenge. A more recent survey by the Economist Intelligence Unit (EIU) shows that employees still spend 25 percent of their time searching for the information they need to do their jobs.

Most documents continue to remain disconnected across organisations of all sizes. The information is often spread across emails, chats, documents, spreadsheets, slides, each of which may reside with different users and in silos, making data gathering extremely challenging.

In addition, with many users wanting easy access, documents are copied, often multiple times, leading to a growing volume. Industry studies point out that 30 percent of document accesses fail because the document has been misfiled, has disappeared or is blocked by access controls. All of this translates into considerable cost, either directly or in the form of lost employee work time. This is where a digital document management system (DMS) becomes relevant.

A digital DMS can address the key challenges where manual document management falls short, explains K Bhaskhar, Senior Vice President, BIS Division, Canon India. With a cloud-based document management system, there is almost 100 percent uptime, so documents can be accessed anytime, anywhere. The system is secure because the firewall and the underlying security software are constantly upgraded. Most importantly, it is a cost-effective proposition for any organisation as it works on a shared-services model, helping the organisation become more collaborative, agile and efficient.

Sometimes, because work processes are not transparent enough, it can be extremely difficult to find a particular document or identify who is in charge of it. Moreover, with employees managing too many documents, it is easy to lose track of them, resulting in missed deadlines. A cloud-based DMS provides version control, and changes made to any document can be easily tracked.

A DMS, especially a cloud-based one, not only addresses these challenges but also ensures process efficiency, secure and confidential access to data, and cost effectiveness. Today, a cloud-based DMS, with its high-performance servers and automatic text recognition, makes it possible to store, manage and access large volumes of documents easily. A user can get to the desired document almost instantly, without searching file servers or folders for hours for a single piece of information.

Venkatesh agrees, saying a cloud-based DMS is better than having a dedicated internal SPOC manage access to official documents. However, he says its success depends on two critical factors. First, the cloud-based DMS should integrate with existing systems to make the experience seamless and scalable, rather than adding another set of siloed document management tools. Any technology solution, no matter how impactful, will not get used unless it weaves into the existing way of working, be it by integrating with email, social collaboration tools, identity management tools or access control systems. Second, there must be a deliberate push for adoption.

And when adoption of document management succeeds, the benefits are best appreciated in the intangibles: the ease with which teams can now access the information they need, says Venkatesh. Sharing an example, he says that with a DMS in place, a sales SPOC can access authorised information and playbooks from other customer engagements and share them at any point in a conversation to win over a potential customer. The DMS, with its built-in security features, ensures that only relevant content is shared without compromising security. He explains that this immediate access to relevant information can be critical in delighting the customer during a sales conversation. With a DMS, there is no dependency on other people for the information needed to do a job, which would otherwise take anywhere from a few hours to days to obtain, he adds.

Agami, a network organisation that depended on a paper-based document management system before the pandemic, has now made the shift to digital solutions.

Shifting to a digital DMS not only eliminated the need for dedicated office space to manage the paperwork but also made the workflow more efficient. For instance, being able to instantly access a document and get it signed digitally translates into a huge advantage, he says.

Today, amidst the pandemic, cloud-based DMS adoption has risen sharply, as has been the case across technology solutions addressing different digitisation use cases. The world today has become fast, from fast food to fast disbursal of cash through ATMs. So why shouldn't access to documents be fast? While adoption has trended upward over the last few years, the curve has been steeper since 2019, and adoption of DMS has increased significantly since the onset of the pandemic and the shift to a hybrid work environment, says Bhaskhar. He adds that adoption has been driven primarily by the needs of the Finance, HR and IT functions.

And with organisations shifting to a hybrid workplace structure for the long term, Bhaskhar opines that DMS will become a mainstay technology for companies. He shares that it is a best practice for organisations, especially startups, to start managing their documents digitally as early as possible. This creates structure and transparency of information at all levels of the company. It will be easier to manage the flow of information when startups start achieving economies of scale, he says.

Today, DMS is witnessing interesting innovations. Innovations in digital rights management, context-aware access, and user provisioning and deprovisioning will further enhance the impact of DMS solutions, says Venkatesh. Bhaskhar points to Canon's DMS, Therefore, as an illustration.

The increasing shift to a hybrid workplace, work becoming more collaborative and complex and innovations in DMS are likely to further make the case stronger for businesses of all sizes across sectors to adopt DMS.


Virospack acquires IFS's technology to improve its operational efficiency – Cosmetics Business

4-Nov-2021

Packaging

Virospack, the specialist in droppers for cosmetics, has reached an agreement with the software provider IFS to use its IFS Cloud solution, with the aim of optimising its production and material traceability, as well as reinforcing its environmental commitment

The production of dropper packs may seem simple and, in fact, just three pieces make up most of these droppers. The Spanish company Virospack has made these products its business, and with 65 years of experience employs 500 people, who design and make high-end cosmetic packs for well-known international brands.

However, there are numerous challenges facing the production of these droppers, from optimising the design and managing the materials used in production, to the development of increasingly sustainable products, which reduce the environmental impact.

This is one of the reasons that has led Virospack to reach an agreement with the Swedish business software provider IFS to integrate the IFS Cloud solution, which will be implemented in its entirety, including the HR, Finance, Business Intelligence, Project Management, CRM, Quality, Logistics and Purchasing modules.

Companies can use our software to measure how sustainable they are. Thanks to IFS Cloud, Virospack will have multiple parameters that will allow it to reduce its carbon footprint, highlighted the IFS Country Manager for Spain and Portugal, Juan González, in an interview with Europa Press given to mark the beginning of this collaboration with the Barcelona-based company.

Technology and digitalisation are an increasing reality for companies, with 52% of companies affirming that they will increase their spending on digital transformation, according to an IFS study.

However, in addition to improving efficiency, this may also be a lever for environmental change.

Although servers consume electricity, the impact on energy consumption can be reduced by using central servers, in addition to other benefits, such as avoiding the use of paper.

In line with this, Virospack is looking to further improve on its current environmental commitments. The Badalona-based company already holds the ISO 14001 certificate for environmental management systems, uses 100% renewable energy at source at all of its facilities and 100% certified wood, and recycles 80% of its hazardous waste.

As part of the collaboration agreement, both companies will work together over the next 12 months to develop tools adapted to Virospack's operation. It is expected that IFS's solutions will begin to be used in December 2022 as a result of this strategic agreement, and their implementation will begin this December.

This is according to Virospack's Service Manager, Montserrat Florencio, who explained that there will be a complete integration of all the transactions of IFS Cloud. Antalis Consulting Services, an IFS partner, will also be involved in this implementation, providing its experience in production.

Another cornerstone of the agreement focuses on efficiency, which can be improved thanks to the use of the IFS Cloud solution, launched in March to replace IFS Applications. It allows customers to manage all of their products in a unified way, on a single API-based platform designed for the cloud but able to run anywhere.

According to IFS's Country Manager, the company stands out from other business software providers because it offers a single solution that brings three products together in one platform: the company's ERP (Enterprise Resource Planning), field management and fixed asset management. This integration may improve efficiency and avoid integration problems.

The improvements in efficiency will result from improved interdepartmental communications, providing the benefit of more streamlined flows and greater efficiency, according to Montserrat Florencio.

In Virospack's case, traceability was another aspect that warranted the agreement with IFS. This aspect affects both the environmental commitment (it allows the carbon footprint of each product to be traced) and other aspects, such as the materials used, or even the fact that documents can be accessed with a single click.

Better traceability will increase our efficiency. We are already working in this area, but it is difficult to obtain data as we have to use different programs, applications and tools, said Virospack's Service Manager.

The arrival of the Covid-19 pandemic put a strain on many manufacturing sectors, as in the case of semiconductors, and the global industry is already suffering from the shortage of processors. Even so, Virospack grew its sales by more than 20% in 2020 compared to the previous year.

The company, which specialises in dropper packs, has highlighted its commitment to innovation, both through its agreement to use IFS's ERP and through a parallel project to create a new automated warehouse, which will be launched simultaneously.

Innovation is not something that is improvised; it is not just for the production, design and industrialisation of a product. We also look to provide the company with technological innovation that allows us to improve our current processes, added Florencio.

The arrival of the pandemic benefited us, despite its impact, agreed González, who highlighted the efforts of companies to provide their employees with professional software tools. Software companies have to commit to making the employee's life easier, he concluded.


Moving to the cloud: Resistance is futile but not for everyone – TechGenix

No. 1 on Gartner's top 10 strategic technologies for 2010 was cloud computing. It was once again top of the list in 2011, but 2012 saw cloud computing demoted to No. 10. It danced around the list over the next few years and eventually started to take on different personas. Personal cloud was given a middle-of-the-road placing in 2013, and then in 2014 it advanced into the architecture behind the cloud. Makes sense. Then 2015 introduced us to the concept of computing everywhere. If we do a little forensic analysis, it seems that by 2016 enterprise organizations had embraced the cloud, and it moved from trendy to operational.

The reasons are many. Offloading the heavy lifting required to acquire, set up, and maintain the internal infrastructure behind today's technology is a huge weight off any corporate executive's shoulders. The ability to access applications and data regardless of location or time zone means an increase in productivity. There is an often-heard argument that cloud computing is more cost-effective, although there is always another side to that argument. Nonetheless, many enterprise organizations have made the move to the cloud. But not all. One cannot help but wonder, with all of the positive press around the cloud, not to mention the skilled sales tactics employed by profit-maximizing vendors, why there are still organizations resisting the urge to jump on board and migrate to the cloud.

Stripped of all the fancy wrapping, the cloud is really just someone else's server. That means the security around the service is only as good as the practices employed, not to mention the skillsets of the service provider's employees. Dropbox, OneDrive, and Google, oh my! The increased use of cloud platforms has also meant an increase in security concerns. Insecure APIs, external attacks, and compliance risks are all at the forefront of concerns contemplated by cybersecurity professionals. But it is also true that we would have these same concerns if our servers all resided in an onsite locked room. The advantage to hackers is that data is now centralized and therefore creates a rather irresistible target. So perhaps security concerns are a wash: we will have the same concerns no matter where our data resides. It is true that when servers are onsite, we can maintain greater control. It also means that we have the fiscal and legal responsibility to ensure the security of that data. To some, it is less stressful to contract that responsibility to a service provider dedicated to overseeing security. Overall, service providers are quite disciplined when it comes to applying security patches and updates, which is not always true of localized IT departments.

Companies are seldom willing to give up those requirements that make them unique. To move to a hosted application in the cloud means conforming to the configuration limitations imposed by the vendor. Service providers need to do this. If they were to offer customization to every customer, their offering would be much too expensive to be attractive to the average corporate consumer. In addition, it would become impossible for them to maintain upgrades. Only those organizations that are willing to let go of nonstandardized business processes can benefit from migrating applications. Of importance is the knowledge that repeatable and standardized processes are much less expensive to operate and maintain. While it is true that onsite applications can also be developed by disciplined and mature organizations to follow documented and standardized processes, it is also expensive for organizations to build workarounds on the fly. And these workarounds, far too often, turn into the regular process.

An often-heard argument against cloud service providers is the limited access to data. And it is true that organizations with high turnover may find themselves at a disadvantage when choosing to move to a new service provider. In an effort to keep our data secure, cloud service providers will ask their customers to identify by name those who will require different levels of access to data. It is administratively heavy to change these contact names for obvious reasons, and we do want it to be a difficult procedure with many checks and balances. Unfortunately, this can be construed as not having access to data that is rightfully owned by the customer. On the other side of this equation, it is important to ensure that during the procurement process, it is clear that the data is owned by the customer and we have access to it as required. This includes the ability to download all required data should we choose to change our service provider.


Yet another data consideration is what vendors feel entitled to do with the data entered and stored by organizations. While legal compliance, in theory, represents the rights of customers, it is important to understand that we need to have a legal interpretation of the contract we are signing. This is yet another risk that smaller organizations encounter. It is an expense that can become unmanageable with less respectable vendors.

Can your server connection handle moving the kind of data that your organization requires? Connectivity is one of the more controversial topics of cloud services. It is also one that is often not discovered until after go-live, which can be the cause of a very poor user experience. When selecting a cloud service provider, there are a few requirements that can help to alleviate this issue. First of all, select a service provider that has multiple datacenters. Latency can be caused by distance, and it is a good idea to know that you will have the option to move to a datacenter in a different location if this turns out to be a persistent issue. Another issue may have nothing to do with cloud anything. Ensure that the bandwidth of your Internet service provider is understood and that it meets the suggested requirements of the cloud service provider. And then, of course, there is the bandwidth of your internal network. This needs to be tested before cutover.
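The pre-cutover checks described above can be sketched in a few lines of Python. This is an illustrative sketch, not a provider tool: the hostnames are hypothetical, and the datacenter-selection logic is kept separate from the network probe so it can be exercised with canned measurements.

```python
import socket
import time

def rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Measure one TCP connect round trip to a host, in milliseconds."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection is closed on exiting the with-block
    return (time.monotonic() - start) * 1000.0

def pick_datacenter(latencies: dict) -> str:
    """Choose the candidate region with the lowest measured round-trip time."""
    return min(latencies, key=latencies.get)

# With live probes (hostnames invented for illustration):
#   measured = {dc: rtt_ms(f"{dc}.example-cloud.net") for dc in ("us-east", "eu-west")}
# With canned measurements:
measured = {"us-east": 12.4, "eu-west": 86.1, "ap-south": 210.7}
print(pick_datacenter(measured))  # us-east
```

Running such a probe from the office network, before go-live, surfaces both the distance-related latency and any internal bandwidth bottleneck while there is still time to choose a different datacenter.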

Enterprise technology projects are expensive and time-consuming. They also require the engagement of many operational team resources and stakeholders. There are times in every organization's life when it is wise not to take on another large project. Timing is everything. If adequate resourcing and timing are not allocated, a poorly planned configuration can cause a headache that sticks around for a very long while.

The ability to host applications in the cloud, usually via a vendor that provides the application as a service, has been a savior in current times. The ability to access applications from any location, not to mention removing the need for a server room fully staffed with expensive technology resources, has been an advantage under our current work-from-home regime. The concept is solid, and the positive results have been proven. However, there are legit reasons that enterprise organizations may hold off migration projects. Overall, there does seem to be a direct correlation with trust. Trust of vendor partners and cloud service providers. Trust that our Internet service providers can handle the constant flow of data packets we will need to send, and even trust that our stakeholders have built and can maintain standardized and repeatable processes. While technology continues to offer new solutions to compensate for the constant challenges thrown at the business world, it seems that the issues that keep a CIO awake at night have not changed much over the years.



Real words or Buzzwords?: Cloud Native IoT – SecurityInfoWatch

Oct. 26, 2021

A continuing look at what it means to have a 'True Cloud' solution and its impact on todays physical security technologies


Endless regression: hardware goes virtual on the cloud – E&T Magazine

In the summer of 2018, professors John Hennessy and David Patterson declared a glorious future for custom hardware. The pair had picked up the Association for Computing Machinery's Turing Award for 2017 for their roles in the development of the reduced instruction set computer (RISC) architectural style in the 1980s.

Towards the end of their acceptance speech, Patterson pointed to the availability of hardware in the cloud as one reason why development of custom chips, and the boards they will be soldered onto, is becoming more accessible. Cloud servers can be used to simulate designs on demand and, if you have enough dollars to spend, you can simulate a lot of them in parallel to run different tests. If the simulation does not run quickly enough, you can move some or all of the design into field-programmable gate arrays (FPGAs). These programmable logic devices won't handle the same clock rates as a custom chip, but they might be only five or ten times slower, particularly if the design you have in mind is some kind of sensor for the internet of things (IoT), where cost and energy matter more than breakneck performance.

The great news that's happened over the last few years is that there's instances of FPGA in the clouds, said Patterson. You don't have to buy hardware to do FPGAs: you can just go to the cloud and use it. Somebody else sets it all up and maintains it.

A second aspect of this movement is being driven by projects such as OpenROAD organised by the US defence agency DARPA. This aims to build a portfolio of open-source hardware-design tools that lets smaller companies create chips for their own boards instead of relying on off-the-shelf silicon. In principle, that would make it easier to compete with bigger suppliers who traditionally have been able to deploy customisation to improve per-unit costs.

For more than a decade, those bigger silicon suppliers have used simulation to deal with one of the main headaches in custom-chip creation. Getting the hardware to boot up and run correctly is one thing; getting the software to run often winds up the more expensive part of the overall project. As debugging software for a chip that doesn't exist yet is tricky, they turned to simulation to handle it. Even if the hardware is not fully defined, it is often possible to use abstractions to run early versions of the software, which are then gradually refined as the details become clearer. The old way of handling that was to use some hardware-and-FPGA combination that approximated the final design and have it running on a nearby bench. That is changing: it's not just hardware designers running simulations, it's increasingly the software team.

When we started 12 or 13 years ago, everyone was doing simulation for hardware to get the SoC to work, says Simon Davidmann, president of Imperas, a company that creates software models of processor cores. We founded Imperas to bring these EDA technologies into the world of the software developers. We learned with Codesign [Davidmann's previous company] that software development would become more like the hardware space.

A second trend is the pull of the cloud. The designs may run on models that trade off accuracy for speed on a server processor in the cloud or a model loaded into an FPGA or a mixture of both. As Imperas and others can tune their models for performance by closely matching the emulated instructions to those run by the physical processor, a typical mixture is to have a custom hardware accelerator and peripherals emulated in the FPGA and the microprocessors in fast software models.

Davidmann says the trend towards more agile development approaches in the embedded space is driving greater use of simulation. Even hardware design, which does not seem a good fit for a development practice that relies on progressive changes to requirements and implementations, has used them. One of the main reasons is the extensive use of automated testing. Whenever code, whether it's hardware description or software lines, gets checked in, the development environment runs a batch of quick tests, with more scheduled for the night. If the new code triggers new bugs, it gets sent back. If not, the developer can continue.
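The check-in flow described above can be sketched in a few lines. This is an illustrative model, not any vendor's EDA API: the test names, the commit id and the nightly queue are invented for the example; only the gating logic (quick tests now, long simulations overnight, failures sent back) mirrors the text.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class RegressionTest:
    name: str
    run: Callable[[str], bool]  # takes a commit id, returns pass/fail

def gate_checkin(commit: str,
                 smoke: List[RegressionTest],
                 full_suite: List[RegressionTest],
                 nightly_queue: List[RegressionTest]) -> Tuple[str, List[str]]:
    """Run the quick tests now; schedule the long simulations for the night."""
    failures = [t.name for t in smoke if not t.run(commit)]
    if failures:
        return "rejected", failures        # code goes back to the developer
    nightly_queue.extend(full_suite)       # long-running sims wait for the nightly batch
    return "accepted", []

# Example: a lint check passes but the quick boot simulation fails.
smoke = [RegressionTest("lint", lambda c: True),
         RegressionTest("boot_sim", lambda c: False)]
print(gate_checkin("abc123", smoke, [], []))  # ('rejected', ['boot_sim'])
```

In a cloud setup, each entry in the nightly queue would be dispatched to a freshly spun-up simulation instance, which is what makes the on-demand model attractive.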

This continuous integration and test relies on servers being available and ready to run the emulations and simulations whenever needed. That, in turn, points to the cloud, as it is easy to spin up processors for a battery of tests on demand. Even once the target hardware has finally come back from the fab, simulation still gets used. Though one way to test in bulk on finished hardware is to run device farms (basically shelves stacked with the target boards and systems), these present maintenance issues. They are always breaking and often have the wrong version of the firmware, Davidmann says. Moving to continuous integration doesn't work that well with hardware prototypes.

You can quickly push new versions to simulations in the cloud, turn them off and on again virtually. And, funds allowing, run many of them in parallel, which can be vital if a team has to meet a shipping deadline with a shipment-ready form of the firmware.

Now, the use of simulation is moving even further into the lifecycle, as evidenced by Arm's launch of its Virtual Hardware initiative last week. The core technology underneath is the same as that used to support conventional chip designs, including fast processor models similar to those provided by Imperas and others.

In its current form, Arm Virtual Hardware is limited in terms of the processors it supports. The off-the-shelf implementation in the free beta programme covers just one processor combination: the recently launched Cortex-M55 and its companion machine-learning accelerator. The presence of the accelerator provides much of the motivation for the virtual-hardware programme.

Stefano Cadario, director of software product development, said at Arm's developer summit last week that one of the driving forces behind the programme is the steep increase in the complexity of software, driven by several factors: managing security, over-the-air updates and machine learning.

Where so much of an embedded device's interaction is with cloud servers that deliver software updates as well as authenticate transactions, it makes sense to be able to run and debug that interaction in the cloud. But machine learning presents a situation where updates will be far more frequent than they are today. The models will typically be trained off-device on cloud servers, as the target hardware does not have the performance or raw data to do the job itself. Potentially, devices could get updated models every night, though the frequency will most likely be a lot lower than that.

Development teams need to be sure that a new model won't upset other software when loaded, which points to regression testing being used extensively on simulated hardware in the cloud. That automated testing potentially makes it possible for machine-learning models to be updated by specialist data scientists without the direct involvement of software writers, unless there is a big enough change to warrant it. The result is a situation where Arm expects customers to routinely maintain cloud simulations for years, through the entire lifecycle of the production hardware.

As with existing virtual-processor models, the Arm implementation makes it possible to gauge performance before a chip has made it back from the fab. According to Cadario, Cambridge Consultants used an early-access version to test the software for a medical device, and Google's TensorFlow team optimised the machine-learning library for the accelerator earlier in the development cycle than they would normally.

Arm has not yet said which, if any, other processors will be added to the programme. However, it seems likely that it will not go outside the company's own portfolio. Where we are different is that we support heterogeneous platforms, Davidmann says. We've got some of the largest software developments using our stuff because it can support heterogeneous implementations.

There will still be a place for prototype hardware, not least because field trials of ideas will still have to take place before suppliers commit to hardware. But if there is a push towards the use of more custom hardware, it will be cloud simulation that helps drive it.



How bare metal servers enhance the work of online media – www.computing.co.uk

Content production is increasing rapidly, and so are data transfer speeds. Providers who are eager to stimulate demand for their services must consider users' requests. Dedicated servers as-a-service, which we are all accustomed to, are the best option to choose.

Verizon Media makes use of hundreds of thousands of machines with more than 4 million cores. Two years ago, Verizon Media senior director architect James Penick said, "We've understood that we need to build a foundation before building the house. Bare metal servers form the basis of our infrastructure. It's like hardening the building using concrete and reinforcing bars."

Bare metal servers help Verizon Media to create the ideal infrastructure for performance optimisation and carry out standard quota requests within a few minutes.

A bare metal server is a physical server located in the cloud. Unlike virtual machines, bare metal servers apply single-user control over the equipment. The client can manage all hardware resources alone, control the server load directly, and work independently from other users' virtual machines.

Bare metal servers can be used in many different ways: you can either deploy virtual machines, or dedicate the whole node to just one project as well as to multiple containers. Bare metal servers are often chosen by media platform developers and those working on applications that require high speed and data security. Let's discuss why.

Unlike virtual machines, dedicated nodes can cope with resource-intensive tasks more efficiently. According to research, the performance of virtual machines used for workloads requiring high data-processing speed is up to 17 per cent lower than that of bare metal servers. That's because dedicated-server users have full access to their hardware and can use all computing resources on their own.

We even knew this back in 2015. In his op-ed column on TVTech, Media Systems Consulting founder Al Kovalick emphasised the advantages of productive dedicated servers for the media industry.

"Bare metal servers have no virtualisation layer. The workloads are deployed on the servers with an OS that has certain preliminary configurations, but the cloud provider doesn't introduce any additional software. It's up to the user to define which software stack to use over the OS. These servers are managed through a control panel that monitors the deployment process, the server capacity and how the server is used. Dedicated servers have maximum performance. The user controls the entire software stack, and there is no virtualisation tax."

Some cloud providers allow you to use high-performance NVMe disks and 3rd-generation Intel Xeon Scalable processors (Ice Lake). For example, we started integrating these processors into our infrastructure in April 2021, together with other pioneers. Such hardware lets you handle demanding workloads quickly.

Imagine that you're launching a new 'Twitch killer.' According to twitchtracker.com, this year the video streaming platform's users have viewed more than 1 trillion minutes of video content. Nowadays an average 720p ('HD') video takes about 900 MB per hour, meaning that Twitch servers send users over 46.5 million gigabytes every day. These are approximate calculations, but this is a good example of what kind of server infrastructure an "average entertainment portal" needs.
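The daily traffic figure is easy to sanity-check. The sketch below assumes the "1 trillion minutes" is an annual total and that the 900 MB/hour average holds; the article's 46.5 million GB/day implies a slightly different averaging window, so treat these as order-of-magnitude numbers.

```python
# Back-of-envelope traffic estimate for a streaming platform, using the
# figures quoted above (assumptions, not exact twitchtracker.com data).

MINUTES_PER_YEAR = 1_000_000_000_000  # "more than 1 trillion minutes" viewed
MB_PER_HOUR_720P = 900                # average 720p video size, MB per hour

hours_per_day = MINUTES_PER_YEAR / 60 / 365          # viewing hours served per day
gb_per_day = hours_per_day * MB_PER_HOUR_720P / 1000  # decimal gigabytes per day

print(f"about {gb_per_day / 1e6:.1f} million GB per day")  # about 41.1 million
```

That lands in the same ballpark as the article's 46.5 million GB figure, which is the point: any platform competing in this space must budget for tens of millions of gigabytes of egress per day.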

According to the Digital 2021 United States Of America report, this year the average download speed on mobile and fixed-line devices in the United States reached 67.3 Mbps and 174.6 Mbps respectively.

Content providers need to satisfy users' growing demands and account for their capabilities. They must compete with giants like Netflix if they want to gain users' attention in the media space. Thus content delivery speed and quality must be at a high level, which requires infrastructure with the corresponding bandwidth - something most readily achievable with bare metal servers.

According to the Ericsson Mobility Report, 63 per cent of mobile traffic comes from videos. By 2025, this value will have reached 76 per cent. The growing market share of streaming media is responsible for such rapid changes.

If media platforms, services and portals want to be successful in the growing content consumption market, they need to provide maximum performance. At peak load times - calendar events, significant sports broadcasts, and the like - the ability to choose and configure every component of the equipment becomes more important than ever.

In G-Core Labs' public cloud - Bare-Metal-as-a-Service - customers can automate all of these tasks (including equipment configuration, orchestration, and adding new dedicated servers) using APIs. This allows you to scale your platform quickly and to make sure it meets your clients' resource needs.
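As a rough sketch of what that kind of automation looks like in practice: the endpoint path, auth header and payload fields below are hypothetical placeholders, not G-Core Labs' actual API schema - consult your provider's API reference for the real one.

```python
# Illustrative sketch: adding a dedicated server through a provider's
# REST API. All names here (API_BASE, flavor presets, endpoint path)
# are assumed placeholders for demonstration only.
import json
import urllib.request

API_BASE = "https://api.example-provider.com/v1"  # placeholder base URL
TOKEN = "YOUR_API_TOKEN"                          # placeholder credential

def build_server_request(flavor: str, image: str, region: str) -> dict:
    """Describe the dedicated server we want the provider to bring up."""
    return {
        "flavor": flavor,   # hardware preset, e.g. a CPU/RAM/disk combination
        "image": image,     # OS image to deploy on the bare metal node
        "region": region,   # target data centre
    }

def provision(spec: dict) -> urllib.request.Request:
    """Build the POST request; pass it to urllib.request.urlopen to send."""
    return urllib.request.Request(
        f"{API_BASE}/baremetal/servers",
        data=json.dumps(spec).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = provision(build_server_request("bm-2xlarge", "ubuntu-22.04", "ams-1"))
print(req.get_method(), req.full_url)
```

The value of an API like this is that the same request can be issued by an autoscaler ahead of a predictable traffic peak, rather than by a human filing a ticket.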

"Unexpected performance decrease is what people dislike about cloud computing," saidMark Stymer, President of Dragon Slayer Consulting. "If you opt for a bare metal server, your server's performance is more predictable, and the provider can scale it up on demand and do it relatively quickly."

If you publish a lot of content, choose a provider with a broad network of points of presence. The equipment should preferably be located in Tier III or IV data centres at larger traffic exchange points. In that case, there is no need to worry about content delivery speed or reliability.

Level      Idle time (hours per year)   Fault tolerance value (%)
Tier I     28.8                         99.671
Tier II    22                           99.741
Tier III   1.6                          99.982
Tier IV    0.4                          99.995

Data centre levels, as classified by Uptime Institute
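The idle-time column follows directly from the fault-tolerance percentages: annual downtime is simply (1 - availability) times the hours in a year. A quick sketch, assuming a 365-day (8,760-hour) year:

```python
# Sanity-check of the Uptime Institute figures: downtime per year
# = (1 - availability) * hours in a year.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours

availability = {  # fault tolerance value, per cent
    "Tier I": 99.671,
    "Tier II": 99.741,
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

for tier, pct in availability.items():
    downtime = (1 - pct / 100) * HOURS_PER_YEAR
    print(f"{tier}: {downtime:.1f} hours of downtime per year")
# Tier I -> 28.8, Tier III -> 1.6, Tier IV -> 0.4, matching the table;
# Tier II works out to ~22.7 hours, which the table rounds down to 22.
```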

Many media platforms have specific user security policies. Whether it's a corporate social network or an adult content site, user data and content must be reliably protected. Recall the 15 biggest data breaches of the 21st century - and bear in mind that start-up projects face smaller-scale risks all the time.

G-Core Labs' users have access to enhanced protection against DDoS attacks. In case of a server attack, the traffic is redirected to a threat mitigation system (TMS), which detects attacks, filters the traffic and allows only harmless data to reach the server. Users can configure the TMS protection policies on their own. The main advantage is that the IP doesn't get blocked during an attack, and the server remains accessible to users.

When creating a media business, be it a streaming platform, a small independent media source publishing its content, or a photo and video hosting site, you need to decide how to optimise the company's resources. Dedicated servers provide excellent opportunities to achieve this. The providers' pricing policies allow you to configure the required server precisely enough, and as a result you won't have to overpay for any idle megabit of memory. Cost predictability is another advantage provided by bare metal servers.

For example, our dedicated server tariff enables you to select individual parameters. You can install RAID or change its type, increase the disk volume and the number of disks, increase RAM volume, replace SSDs with HDDs and vice versa, and install 10 Gbps network cards. You can also configure hardware with the engineers' help.

Internet tariff parameters including traffic package and bandwidth size can be changed independently. Changing to another tariff requires no extra payment.

See the rest here:
How bare metal servers enhance the work of online media - http://www.computing.co.uk

Wells Fargo has a new virtual assistant in the works named Fargo – CNBC

A Wells Fargo logo is seen at the SIBOS banking and financial conference in Toronto

Chris Helgren | Reuters

Wells Fargo is developing a virtual assistant to help it convert more retail banking customers into digital users, CNBC has learned.

The assistant, named Fargo, will be able to execute tasks including paying bills, sending money and offering transaction details and budgeting advice, according to Michelle Moore, the bank's consumer digital head. It's expected to be out next year after the bank releases a revamped mobile app and website in early 2022, she said.

The move by Wells Fargo, a consumer banking giant with more branches than any lender except JPMorgan Chase, is part of a broader technology overhaul under CEO Charles Scharf. Updating the bank's aging systems has been a priority for Scharf since becoming chief executive two years ago, as well as a key part of the turnaround needed after the bank's 2016 fake accounts scandal. Last month, Wells Fargo announced a decade-long plan to move computing to Google and Microsoft cloud servers.

Michelle Moore, Consumer Digital head at Wells Fargo

Source: Wells Fargo

"Everyone lives on their phone, and there's an expectation on how things should work," Moore said in a Zoom interview. "Our clients were telling us that our app was not easy to use, it's not intuitive, there were too many dead ends and clients were getting stuck."

While it had the most extensive brick-and-mortar presence of any U.S. bank for years, only being eclipsed in branch count last quarter by JPMorgan, Wells Fargo trails rivals in digital adoption. Regulators have criticized the firm's technology systems, and a 2019 mishap at a Minnesota data center knocked out customers' mobile and web access for hours.

Its 27 million active mobile users are fewer than those of JPMorgan and Bank of America. Despite the boost that the coronavirus pandemic provided for all things digital, Wells Fargo's 4.2% user growth in the past year is less than half JPMorgan's gains. Studies have shown that digital users are typically more satisfied with their banks, cheaper to serve and less likely to switch providers.

That's probably why Wells Fargo recruited Moore late last year. She is a Bank of America technology veteran who helped develop the company's own virtual assistant, known as Erica. That artificial intelligence-powered service has seen its use surge during the pandemic, tripling the number of interactions to 104.6 million in the past year, Bank of America said this month.

Early this year, Wells Fargo began studying why customers resorted to calling phone help lines and where the bank's app failed them, Moore said. She added that the redesigned app has a simpler login and consolidates payment options, whereas previously they were scattered throughout. Moore also said that future versions will be more capable, as part of the company's new digital-first efforts.

Wells Fargo's revamped banking app.

Source: Wells Fargo

"We can help clients really live their lives and be more than checking balances and moving money," Moore said. "We want to be integrated and we want to help clients do their investments or buy their first house."

As for the name of the bank's virtual assistant, Moore said it was an obvious choice.

"We weren't trying to create a new brand or persona here," she said. "There's a lot you can do with 'Fargo.' Flip the word around, you can 'Go Far.' Let Fargo take you far."

Go here to read the rest:
Wells Fargo has a new virtual assistant in the works named Fargo - CNBC

Facebook is spending more, and these companies are getting the money – MarketWatch

Facebook Inc. plans a spending spree for next year that could give a boost to networking providers and chip companies.

The social-media giant disclosed Monday that it expects capital expenditures of $29 billion to $34 billion in 2022, up from an estimated $19 billion in 2021. The 2022 forecast came in significantly above the FactSet consensus of $23.3 billion from prior to Facebook's FB, -3.92% Monday afternoon earnings call.

Don't miss: Facebook offers a needed dose of pain relief in the face of Apple privacy challenges

"The anticipated bump is driven by our investments in data centers, servers, network infrastructure and office facilities," Chief Financial Officer David Wehner said on the call. One big factor behind the spending growth is increased investment in artificial intelligence and machine learning, according to Wehner, as Facebook looks to enhance its recommendations and ad performance.

See also: Facebook earnings top $9 billion, but Apple change puts sales in the hot seat

Several big hardware and infrastructure companies could gain as Facebook opens its wallet, analysts say. Shares of Arista Networks Inc. ANET, +4.55% are up 4.3% in Tuesday trading, while shares of Cisco Systems Inc. CSCO, +1.22% are up 1.9%, with both seen as potential beneficiaries.

"This is a very positive read for Arista as Facebook is one of the two cloud titans that account for a large portion of Arista revenue," Evercore ISI analyst Amit Daryanani wrote in a note to clients. "This is also a positive for Cisco, to a lesser extent, as we think they may gain some share at Facebook in the 2022/23 time frame."

Wells Fargo's Aaron Rakers agreed that Arista could benefit from Facebook's heightened spending, as the company has been Arista's second-largest customer after Microsoft Corp. MSFT, +0.64%.

Facebook has come in a few billion dollars below its capital-expenditure forecasts in recent years, according to Rakers, though he still sees the company's commentary as one notable positive derivative data point as it relates to cloud capex spending trends into next year.

He was also encouraged by Facebook's commentary around its plans to spend up on solutions that help with artificial intelligence, a trend that could help Nvidia Corp. NVDA, +6.70%.

"Recently a semiconductor industry analyst had noted that Facebook would be going all in on NVIDIA GPUs vs. using Intel's Habana solutions, while also noting that it expects to deploy Intel's Mount Evans IPUs (Infrastructure Processing Units) potentially in every server," Rakers wrote.

He added that the company's discussion should be considered a positive for the overall server CPU market, but that it's particularly interesting in light of continued expectations that AMD could announce Facebook as a new meaningful customer for its third-generation EPYC Milan CPUs going forward.

Nvidia shares are up 6.3% in Tuesdays session, and the company passed the $600-billion market-capitalization threshold for the first time in intraday trading. Shares of Advanced Micro Devices Inc. AMD, +0.47% are up 1.0%.

Facebook's outlook also bodes well for storage companies, several analysts said. Susquehanna analyst Mehdi Hosseini saw the forecast as consistent with his expectation that server and storage builds will show better-than-seasonal trends in the first half of 2022.

Wells Fargo's Rakers said that the commentary remains a positive for the HDD [hard-disk drive] industry (WDC & STX), with cloud-driven nearline HDDs now accounting for ~60%+ of total HDD industry revenue.

Western Digital Corp. WDC, -0.90% shares are off 0.7% in Tuesday trading, while Seagate Technology Holdings PLC STX, -1.22% shares are down 2.7%.

Here is the original post:
Facebook is spending more, and these companies are getting the money - MarketWatch

Cybersecurity to server admin in Linux with this 12-course bundle – BleepingComputer

By BleepingComputer Deals

Most websites today use servers that run Linux. You can also find this open-source OS in cybersecurity, and virtualized in the cloud.

In other words, if you plan to work in technology, there are many good reasons to learn Linux.

The Complete 2021 Learn Linux Bundle helps you master the system, with 12 full-length video courses working towards a respected certification. The training is worth $3,540 in total, but you can get it today for only $59 at Bleeping Computer Deals.

Aside from the fact that Linux is free, there are several reasons why this operating system is popular. First, it's secure: many cybersecurity professionals run Kali Linux on the desktop. Second, it's flexible: there are no walled gardens here. Third, Linux was made for hacking.

This bundle helps you to explore all three key features, and learn valuable tech skills along the way. You get 120 hours of hands-on tutorials in total, delivered by genuine Linux experts.

You start with the fundamentals: how to install Linux, choose your distro, navigate the OS, and use popular apps. The training then shows you how to configure Linux in the cloud, and you pick up key server admin knowledge.

Other courses focus on Linux automation with scripting, and important security techniques. Just as importantly, you get full prep for the CompTIA Linux+ exam. This will impress recruiters around the world.

All the courses come from iCollege, an online learning platform that has helped IT professionals in 120 countries since 2003.

Order today for just $59 to get lifetime access to all 12 courses, and save over $3,400 on the training!

Prices subject to change.

Disclosure: This is a StackCommerce deal in partnership with BleepingComputer.com. In order to participate in this deal or giveaway you are required to register an account in our StackCommerce store. To learn more about how StackCommerce handles your registration information please see the StackCommerce Privacy Policy. Furthermore, BleepingComputer.com earns a commission for every sale made through StackCommerce.

Link:
Cybersecurity to server admin in Linux with this 12-course bundle - BleepingComputer

"Unified Technology Solution" – An InfoNetworks Service that Delivers Managed IT & Network Security Plus Voice and Internet Solutions -…

LOS ANGELES, October 26, 2021--(BUSINESS WIRE)--InfoNetworks today announced a new and unique service called "Unified Technology Solution." Promoted as the answer to fill an existing void in the marketplace, InfoNetworks' Unified Technology Solution offers businesses managed IT services, complete network security, voice and telephony services, and connectivity in a complete package from a single provider.

For more than a year, businesses worldwide have faced unprecedented global events that are dictating policies and procedures. Companies have necessarily cut key budget items, faced new challenges, and managed their businesses with a reduced workforce. Many of these organizations have been tasked with creating remote infrastructure to help them mitigate the ever-changing landscape and support work-from-home or hybrid work environments.

InfoNetworks' Unified Technology Solution is designed to address these challenges with an all-inclusive platform that allows employees, managers, and executives to stay connected and secure both in the office and remotely. InfoNetworks' data connections support the added influx of traffic to the office, while the included cloud-based PBX allows extensions to be accessible via mobile device or laptop. The Unified Technology Solution network supports a mix of Desktop, Softphones, Teams, SIP and PRI interfaces. All technologies are managed by InfoNetworks' experienced Technical Support and Network Engineering Teams and are monitored 24 hours a day, seven days a week by the watchful eye of CyberSecure(SM), an advanced Network Security Software capable of locking down up to 500,000 end points.

"Our Unified Technology Solution is a four-pronged approach," said Bruce Hakimi, Senior Executive at InfoNetworks. "By delivering Managed IT, Network Security, Voice and Data under one source, we can maximize the efficiency and productivity of any organization." He further explained: "By being able to oversee all network elements from the data connection to internal Local or Cloud based Network, InfoNetworks has the advantage of acting and resolving issues quickly without having to wait for other vendors."

Although some data carriers may offer a cloud infrastructure, it is not a true Managed IT service. Their support is mostly limited to their own equipment and servers and does not cover software applications or internal equipment such as PCs, laptops, printers, scanners, WiFi routers and internal network security. If a printer is not working, a server is down or a laptop is hacked, their help desk will not assist. InfoNetworks' Unified Technology Solution offers full LAN support, giving businesses the advantage of having an IT team at their fingertips without the overhead cost.

"It is like having an in-house IT Department that manages and maintains your entire network, from your voice services to your laptops," said Hakimi. "Just think about it: how many companies can direct you to one support number for every type of trouble on your platform from your internet being down to an issue with your Network Security?"

View source version on businesswire.com: https://www.businesswire.com/news/home/20211026005480/en/

Contacts

Francesca Avincola - InfoNetworks Media Relationsfrancesca.avincola@infonetworks.com 310-203-9900 Ext. 103

Read more:
"Unified Technology Solution" - An InfoNetworks Service that Delivers Managed IT & Network Security Plus Voice and Internet Solutions -...