
Lloyds to partner with Google Cloud – Business Insider – Business Insider Nordic

The five-year agreement will see Lloyds Banking Group work with Google Cloud to spur digital evolution and cloud transformation across the UK bank, Computer Weekly reports.

Business Insider Intelligence

The deal forms part of Lloyds' £3 billion ($3.9 billion) digital banking strategy and will build on the bank's multicloud approach. Lloyds intends to utilize several Google Cloud services over the course of the partnership, including Anthos, for modernizing app development; Apigee, for managing its APIs to support open banking initiatives; and other services for improving the customer experience and cloud security.

The bank's willing embrace of cloud technology runs counter to a general lack of trust in the cloud evident in the UK financial sector. A resounding 85% of UK financial sector professionals say they distrust cloud computing, and that's not necessarily due to a lack of understanding of the technology: 55% say they understand it, per 2019 research from Savoy Stewart sent to Business Insider Intelligence.

This distrust could be caused by a number of factors: In the US, where a similar percentage (88%) of financial industry professionals distrust cloud technology, the primary concerns are the possibility of data leaks and a lack of control over their data.

By pushing ahead with cloud integration, Lloyds could position itself to capture the cost benefits and greater efficiency offered by cloud computing before its more reluctant rivals. Using public cloud computing can reduce or eliminate the cost of running on-premise data centers and servers.

The cloud also offers amplified storage and the ability to access a range of applications and emerging technologies like blockchain and AI. And Lloyds already seems to have plans for the latter tech: The bank is keen to explore how it can deploy AI and machine learning to deliver "the best possible personalized experience" to customers regardless of the channel they use to interface with the bank, Lloyds Group Transformation Director Zaka Mian said in a Twitter video announcing the Google Cloud partnership.


Read the rest here:
Lloyds to partner with Google Cloud - Business Insider - Business Insider Nordic

Read More..

What Is An Advanced Cloud? – Forbes

Clouds come in different shapes and patterns - computing ones and real ones.

Not all computing clouds are created equal. To be clear, there is no actual cloud in cloud computing. As we have said before, the cloud gets its name from the curly cartoon circle that network engineers have always used to represent clouds of connected network resources.

So naturally and logically then, not all clouds are equal because any combination of various network resources, tools, performance boosts and optimizations could be brought together to coalesce into the abstracted space that any one particular cloud instance takes up on a server hard disk, in a datacenter or in an on-premises environment.

What is an advanced cloud?

If no two clouds are necessarily the same, can we call some clouds basic standard-issue clouds and some clouds advanced clouds? In theory, we can, but this has become a very difficult topic to research because every cloud computing consultancy from here to Ouagadougou either wants to call itself Advanced Cloud Services by name, or it wants to use those three words to denote the nature of its services offerings in the field.

Given this challenge, let's go ahead and define advanced cloud services (in lower case) anyway.

In the real world, enterprise organizations often find that they need to connect a variety of departments, divisions, different headquarter locations and a plethora of field personnel together. This leads to highly fragmented clouds that span multiple global locations.

Highly fragmented clouds aren't naturally as good at integration, because they're highly fragmented, obviously. Highly fragmented clouds are also harder to manage (in terms of making sure they're all patched for updates, cleaned for data de-duplication and so on) and harder to guarantee resiliency and uptime on because they're strung out around various servers in different data centers.

One useful notion of advanced cloud services then is highly fragmented clouds that are capable of retaining full integration, resiliency and uptime all within the boundaries of the governance and compliance restrictions they need to adhere to.

This means that advanced cloud services exist as virtual entities (as do all clouds, essentially) whose final form is built upon a shared infrastructure. Because compute clouds are virtual, we can change their shape more easily (in reality, many cloud services are sold in quite firm unchangeable form especially in the case of so-called reserved instances, but that's another story), so advanced clouds must be clouds that are eminently scalable (both upwards and downwards).

Steak & fries logic

Advanced clouds are also very much off the à la carte menu, i.e. we (the customer) should be able to have a bit of this and a bit of that, exactly how we'd like it served.

In lunching terms, this means having the medium-rare steak, a side of organic tenderstem broccoli plus the triple-cooked fries and the Béarnaise sauce and extra Dijon mustard. In cloud computing terms this means having offices A & B in Europe served with high-performance cloud services optimized for heavy transactional data throughput, while, at the same time, also having offices X, Y and Z in North America served with additional data storage power (office X), additional data analytics engine call capabilities (office Y) and increased memory performance (office Z).

In practical terms, advanced cloud services may be something of a fanciful notion, i.e. nice to have if we could have everything the way we wanted, but pragmatically hard to pull off and tough to afford at a reasonable price point, given most Cloud Service Providers' (CSPs') proclivity for charging extra for specially tuned cloud instances.

"You could think of advanced in three dimensions where the first is the breadth of services and geographies offered and the second is the flexibility by which they can be combined. A cloud offering a choice of hot (fast) or cold (cheap) storage in combination with VMs in two geographies is more advanced than one offering hot storage in one geography only, and no option of cold storage for VMs. And so on. Even today's biggest cloud providers have significant restrictions in that many services are only available in a subset of geographies, and new services might only be possible to use in specific combinations. The third dimension would be how easy it is to understand and manage the services and flexibility offered, something that is critical to achieve secure and robust operations of a business relying on the cloud," said Dan Matthews, chief technology officer at IFS.

Mark Troester is VP of strategy at cloud and data-centric business applications company Progress. Regardless of any single organization's level of cloud advancement, Troester notes that the reality for most organizations is that they will have a combination of clouds and on-premise applications and data. So, logically then, some will be more advanced than others.

"Even if a single cloud infrastructure is used for custom app capabilities, a different infrastructure for reporting and analytics will be required, not to mention SaaS apps that are on different cloud infrastructure. This reality is further complicated by a mix of access methods (SQL, NoSQL, REST, proprietary APIs) which stand in the way of integrating these different workloads. Organizations need to think about their API approach and consider hybrid data connectivity that leverages standard protocols for communicating across clouds or between clouds and on-premise workloads," said Progress' Troester.
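As a rough illustration of the standards-based connectivity Troester describes, the sketch below pulls the same data from one on-premise workload and one cloud-hosted workload over plain HTTPS/REST and merges the results. The endpoint URLs, field names and token handling are hypothetical placeholders, not anything Progress prescribes.

```python
# Hedged sketch: reading the same "orders" view from an on-premise service and a
# cloud-hosted service over a standard protocol (HTTPS + JSON), then merging them.
# The URLs, paths and field names below are hypothetical placeholders.
import requests

SOURCES = {
    "on_prem": "https://erp.internal.example.com/api/orders",   # hypothetical
    "cloud": "https://analytics.example-cloud.com/v1/orders",   # hypothetical
}

def fetch_orders(name, url, token):
    # Using the same standard protocol keeps the client code identical
    # regardless of where the workload actually runs.
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=10)
    resp.raise_for_status()
    return [{"source": name, **order} for order in resp.json()]

def merged_view(tokens):
    orders = []
    for name, url in SOURCES.items():
        orders.extend(fetch_orders(name, url, tokens[name]))
    return orders
```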

So the debate is open then. Clearly not all clouds are the same and many are sold as specially and specifically tuned to operate in one way or another. The industry doesn't often use the word advanced, possibly because of the whole branding-labeling issue we mentioned at the start here.

We may get to more advanced cloud offerings as the decade progresses, but for now, if you're super hungry, you might be better off just going for the set menu and extra A1 sauce.

Go here to read the rest:
What Is An Advanced Cloud? - Forbes

Read More..

Cloud Adoption Will Be On A Rise In 2020 – HostReview.com

Cloud computing is becoming more popular every year. Businesses across the world have realized the importance and usefulness of cloud computing, and as a result the adoption rate is rising rapidly. The cloud is no longer meant only for a specific type of industry or a certain size of business; rather, it is used by a wide range of companies, from small and medium-sized firms upward. In fact, 2020 looks set to be a big year for cloud adoption. In this article, we look at cloud trends for 2020 and the advantages of using the cloud.

According to one report, by 2020 almost 83% of enterprise workloads could be cloud based. A few of the top reasons that make the cloud a favorite include better security, agility and digital transformation. Cloud integration is straightforward for businesses because they don't need legacy infrastructure; all they have to do is optimize their processes, workloads and systems in the cloud. Although cloud migration may take more planning, cloud integration is easy and quick.

We will also see a lot of development in the cloud industry. Clouds will be evolved and upgraded constantly to meet the requirements of businesses. Development will move at such a pace that most of us won't be able to predict it and will be surprised by it. Businesses will focus on standardization and greater compatibility, and these are further reasons they will want to adopt the cloud.

Speed of the cloud will be the talk of the town

One of the key predictions for 2020 is that the speed of the cloud will increase significantly. As quantum computing edges toward reality, speed will be a key focus area for companies. Hardware will be far more advanced, so servers will be able to process at a much quicker pace. Since cloud computing relies heavily on the speed of the network and the underlying servers, faster servers automatically mean better speed from the cloud too. Cloud computing can also reduce power consumption, even when a large number of computing tasks are performed.

The businesses will be more flexible

Flexibility is a key reason behind cloud adoption. Clouds are certainly far more flexible than on-premises programs, computers and devices. If a company has offices or people working across the globe, cloud adoption makes it easy for them to work together, and remote working becomes easier as well. That's not all: if a firm wants to expand, the cloud makes the process of expansion much faster. Cloud computing lets people access more resources in real time, and processes can be scaled down when usage is low, which makes the cloud more cost effective than an on-premise solution. Many ecommerce firms use this flexibility to scale up for periods when traffic is higher and scale back down when traffic drops.

Always available

A large number of cloud providers are able to maintain a whopping 99.99% uptime, so there is little doubt that they are almost always available. Unlike on-premises systems that may have to shut down after a certain period, the cloud is there almost all of the time. As long as you have a strong network or internet connection, you can use the cloud and work on applications from it; you may even be able to keep working in some applications offline.
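To put that uptime figure in perspective, even 99.99% still allows a small downtime budget; the quick calculation below shows roughly how much per year and per month.

```python
# Rough downtime budget implied by a given uptime percentage.
def downtime_minutes(uptime_pct, period_hours):
    return (1 - uptime_pct / 100) * period_hours * 60

yearly = downtime_minutes(99.99, 365 * 24)   # about 52.6 minutes per year
monthly = downtime_minutes(99.99, 30 * 24)   # about 4.3 minutes per month
print(f"99.99% uptime allows ~{yearly:.1f} min/year (~{monthly:.1f} min/month) of downtime")
```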

Serverless?

One of the key trends to watch in 2020 is serverless technology, which stands to transform back-end computing. With serverless, cloud service providers give businesses the means to execute their code directly on the provider's infrastructure, so organizations no longer need physical servers of their own to run it. It is a deliberate, strategic approach to software delivery aimed at lowering the amount spent on physical servers, and companies also won't have to spend a lot of time on maintenance. Serverless is regarded as the future of distributed computing, and there is little doubt that it will transform the way businesses operate in the cloud.
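As a concrete illustration of the model, a serverless function is typically just a small handler that the provider invokes on demand and bills per invocation. The sketch below assumes an AWS Lambda-style Python runtime; the event fields are hypothetical.

```python
# Minimal sketch of a serverless function, assuming an AWS Lambda-style runtime.
# The provider provisions and scales the underlying servers; the business only
# deploys this handler. The event fields used here are hypothetical.
import json

def lambda_handler(event, context):
    # 'event' carries the request payload; 'context' carries runtime metadata.
    order_id = event.get("order_id", "unknown")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"processed order {order_id}"}),
    }
```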

Clouds are super easy to manage

Managing the cloud is simple and smooth. Businesses don't have to spend much time, energy or resources on maintaining it, and IT management is greatly simplified because teams don't have to look after the underlying infrastructure. Since customers don't own the hardware or software infrastructure themselves, they aren't responsible for maintaining it. The cloud service provider still has to do a lot of work to maintain the cloud, but the customer doesn't have to worry about it.

Scaling up is easier with the cloud

Businesses that are planning to grow want to adopt technologies that help them grow faster. Cloud computing makes a business far more scalable: because companies don't have to buy and deploy new hardware or software, scaling becomes easy. If the company expands, it simply increases its cloud computing capacity.

Cloud computing is not only more efficient, useful and powerful, it is also a more practical approach for business. In 2020 and beyond it will become more and more advanced, so businesses will have to adopt the cloud quickly to make sure they keep pace with the market.

Visit link:
Cloud Adoption Will Be On A Rise In 2020 - HostReview.com

Read More..

Alveo U25 SmartNIC turnkey solution for the Cloud – Fudzilla

Xilinx solution to reduce cloud latency by 80 percent

The datacenter is exploding: the volume of global data center traffic will triple between 2016 and 2021. Datacenter-to-outside traffic doubled from two Exabytes in 2016 to four Exabytes in 2019, and by 2021 it is expected to grow to six Exabytes. Interestingly enough, traffic within data centers is exploding too, as it grew from five Exabytes in 2016 to 10 Exabytes in 2019, and by 2021 the expectation is that it will grow to 15 Exabytes.

Xilinx has acquired a company called SolarFlare, a specialist in SmartNICs, and aims to relieve some of the significant compute resources necessary for networking. More than 80 percent of current cloud infrastructure doesn't have SmartNIC access today. The Xilinx Alveo U25 SmartNIC platform is designed to deliver a true convergence of network, storage, and compute acceleration functions on a single device.

The Alveo U25 SmartNIC is designed to bring greater efficiency and lower Total Cost of Ownership (TCO) benefits of SmartNICs to cloud service providers, telcos, carriers, and private cloud data center operators struggling with increasing networking demands and rising costs.

The U25 combines a highly optimized SmartNIC platform with a powerful and flexible FPGA-based engine that supports full programmability and turnkey accelerated applications. The U25 delivers a comprehensive SmartNIC platform to address the industry's most challenging demands and workloads such as SDN, virtual switching, NFV, NVMe-oF, electronic trading, AI inference, video transcoding, and data analytics.

Like most Xilinx products, the U25 card is fully programmable via the Vitis unified development environment, offering turnkey applications that cover a broad set of compute problems for smaller cloud providers. The SmartNIC is designed to lower the total cost of data center infrastructure, and at the same time the ability to add IP that extends and optimizes certain functionality gives the Alveo U25 a lot of leverage versus a traditional platform. Instead of handing everything to the CPU, the card can run some tasks directly on the SmartNIC, saving a lot of resources in the data center.

The Alveo U25 SmartNIC features two 10/25G ports, two PCIe Gen3 x8 interfaces, SFP28 Direct Attach Copper and SR Optical. It comes in the HHHL form factor. It supports Onload, TCP Direct, NETDEV, and DPDK Poll Mode Driver modes for acceleration and low latency.

Baseline NIC features stateless and tunneling offloads, LSO/TSO, RSS, Checksum, SR-IOV, Multiqueue, NetQueue, and 2048 vNICs support. It supports NVMe/TCP for kernel and userspace. The FPGA programmable part includes 520K LUTs, Quad ARM A53 processor complex, and 6GB DDR4 SDRAM.

FPGA Bump-In-The-Wire Acceleration supports OVS, Encryption, Security ACLs, DPI as well as Machine Learning, Video Transcoding and Data Analytics.

Solarflare-based technology has already been proven in the field. It is currently deployed in enterprise, telecom and cloud data centers for some of the world's most demanding customers, including Nasdaq, the Shanghai Stock Exchange and NYSE Euronext.

The platform enables bump-in-the-wire network, storage, and compute offload and acceleration functions for maximum efficiency by avoiding unnecessary data movements and CPU processing. Bump in the wire dramatically reduces the CPU burden, and reclaims resources to run more applications. Embedded ARM processors provide unique and critical control plane processing to support emerging bare metal server use cases. The baseline NIC delivers ultra-high throughput, small packet performance, and low-latency.

Standard full-featured NIC functionality and drivers, including Onload application acceleration software, can reduce latency up to 80 percent and improve transmission control protocol (TCP)-based server application efficiency by up to 400 percent in cloud-based applications.

"Today's cloud infrastructures suffer from critical data bottlenecks caused by server I/O," said Donna Yasay, vice president of marketing, Data Center Group at Xilinx. "With up to 30 percent of data center compute resources allocated for networking I/O processing, overhead continues to grow along with CPU cores. Xilinx is addressing the challenges resulting from the increased demands on networking by providing an easier-to-deploy SmartNIC with turnkey accelerated applications and out-of-the-box capabilities that go far beyond fundamental networking."

The first out-of-the-box accelerated application available on the Alveo U25 SmartNIC is support for Open vSwitch (OVS) offload and acceleration. The plug-and-play solution will offload over 90 percent of OVS processing from the server to improve packet throughput by over 5X.

Future turnkey solutions from Xilinx are planned for security functions such as IPSec, SSL/TLS, AES256/128, and distributed firewall as well as AI inference acceleration. The Alveo U25 SmartNIC is currently sampling with early access customers. General availability is expected in the third calendar quarter of 2020.

The offload turnkey solution can help Xilinx target 1.4 million servers across legacy telecoms, small and medium businesses, and legacy enterprises. With extensible offload, Xilinx has a shot at Tier 2 cloud CSP and ASP customers, including CSPs and telcos, ASPs (application service providers), and hybrid and private clouds. The total market reachable with this solution is around 6.8 million servers.

Targeting Tier 1 hyperscale cloud customers, including Amazon, Google, Microsoft and China's BAT, would require some programmable solutions on top of the Alveo U25's out-of-the-box functionality, but it would increase the potential customer base by an additional 5 million servers.

Xilinx also introduced its first OCP 3.0 Ethernet adapter and OCP Accelerator Module, unveiling the new XtremeScale X2562 10/25Gb Ethernet adapter card based on the OCP Spec 3.0 form factor.

Designed for high-performance electronic trading environments and enterprise data centers, the X2562 features sub-microsecond latency and high throughput with ultra-scale connectivity for real-time packet and flow information to thousands of virtual NICs.

The X2562 is currently sampling and will be generally available in the second calendar quarter of 2020.

Additionally, Xilinx announced a proof of concept for the world's first FPGA-based Open Compute Accelerator Module (OAM). Based on the Xilinx UltraScale+ VU37P FPGA with 8GB of HBM memory and compliant with Open Accelerator Infrastructure (OAI), the mezzanine-based card supports seven 25Gbps x8 links to enable rich inter-module system topologies for distributed acceleration.

See the original post:
Alveo U25 SmartNIC turnkey solution for the Cloud - Fudzilla

Read More..

VMware embraces Kubernetes with vSphere 7 – Blocks and Files

VMware has added Kubernetes support to run containers and virtual machines simultaneously in the new vSphere release. The virtualization giant can now also offer a single management domain that covers containers and VMs in the hybrid cloud.

vSphere 7, launched today, represents the first fruits of the company's Project Pacific. Project Pacific is in turn a component of VMware parent Dell's wider Tanzu initiative to enable its overall product set to build, run, manage, connect and protect containerised workloads alongside virtual machine workloads. (Read more about Tanzu deliverables, in a Dell blog.)

Deepak Patil, SVP and GM for cloud platforms and solutions at Dell Technologies, provided a quote: "As organisations look to solve for managing their private clouds seamlessly with multiple public clouds, we're now able to extend our capabilities to both VMs and containers with a single hybrid cloud platform."

VMware today also announced a new release of VMware Cloud Foundation, a software stack that combines vSphere, the vSAN virtual SAN and NSX networking, which runs on premises and in the public cloud. The latest V4 release includes vSphere 7.0 and so can run VMs and containers at scale, according to VMware.

Dell has built a Cloud Platform system that incorporates VMware Cloud Foundation and Dell EMC's VxRail hyperconverged hardware. It now supports running simultaneous VMs and containers on Dell EMC's PowerEdge servers and some storage systems, including the Unity XT mid-range block and file arrays and the high-end PowerMax arrays. They can now provide storage for containers running in vSphere 7.0. Dell EMC's PowerProtect Data Manager for Kubernetes extends PowerProtect data protection from virtual machines to K8s-orchestrated containers.

Dell's Cloud-Validated Designs cover Unity XT and PowerMax in the Dell Cloud. The company said it can qualify external NFS and Fibre Channel (FC) storage systems for VMware Cloud Foundation but has not revealed details at time of publication.

Customers can run Kubernetes on the Dell Technologies Cloud Platform within vSphere 7.0 within 30 days of vSphere 7.0's general availability. Subscription pricing is available for the cloud platform systems.

VMware traditionally virtualises servers such that a hypervisor runs the physical server and controls the execution of virtual machines using its hardware. These virtual machines (VMs) contain an operating system and applications.

With containerisation, a controlling software entity provides the operating system and its facilities while applications are built as a set of microservices running in containers. These containers use the single set of operating system facilities and so virtualise the server more efficiently, by not duplicating the operating system instances.

The containers are scheduled to run via an orchestration service, and Google's Kubernetes (K8s) is becoming the dominant orchestrator.
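For readers unfamiliar with what orchestration looks like in practice, the short sketch below uses the official Kubernetes Python client to ask an orchestrator which containers (pods) it is currently running; it assumes a reachable cluster and a local kubeconfig file.

```python
# Sketch: asking a Kubernetes cluster which pods it is currently orchestrating,
# using the official Kubernetes Python client. Assumes a kubeconfig file exists
# locally and points at a reachable cluster.
from kubernetes import client, config

def list_running_pods():
    config.load_kube_config()               # reads ~/.kube/config by default
    v1 = client.CoreV1Api()
    pods = v1.list_pod_for_all_namespaces(watch=False)
    for pod in pods.items:
        print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)

if __name__ == "__main__":
    list_running_pods()
```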

Containerisation is becoming popular as a way of writing applications to run in the public cloud, so much so that they are called cloud native. As enterprises with on-premises data centres want to have a common environment for their applications across their own data centres and the public cloud they are beginning to embrace cloud-native application development.

This is at odds with the predominant on-premises application style, which is to use virtualised servers, particularly with VMware vSphere.

VMware has extended vSphere to the public cloud with VMware Cloud Foundation to provide a single hybrid environment. Despite this, many customers are adopting cloud-native applications and they want a common cloud-native environment to cover their hybrid resources.

VMware has shown it can bring K8s into its hypervisor. Nutanix AHV (Acropolis HyperVisor) has its Acropolis Container Services and Karbon front end wrapper for Kubernetes. Other hypervisors, such as Red Hat's KVM and Microsoft's Hyper-V, will surely follow suit. This will help their owners defend their virtual server base against containerisation encroachment and can be presented as helping customers embrace containerisation.

View post:
VMware embraces Kubernetes with vSphere 7 - Blocks and Files

Read More..

Supermicro Unveils MegaDC Servers – The First Commercial Off The Shelf (COTS) Systems Designed Exclusively for Hyperscale Datacenters – Associated…

Press release content from PR Newswire. The AP news staff was not involved in its creation.


Optimized for Large Scale, Rapid Deployment Time and Highest Performance, New MegaDC Line of Servers Supports Open Standards like OpenBMC and OCP V3.0 SFF Cards

SAN JOSE, Calif., March 10, 2020 /PRNewswire/ -- Super Micro Computer, Inc. (Nasdaq: SMCI), a global leader in enterprise computing, storage, networking solutions and green computing technology, today launched its new MegaDC line of servers, the industry's first COTS systems designed exclusively for large scale deployment in hyperscale datacenters.

Supermicro's breakthrough MegaDC servers are purpose-built and flexible COTS platforms specifically designed for hyperscale infrastructure deployments. By reducing the component count and optimizing the power distribution and backplane designs, MegaDC servers deliver increased cost effectiveness and reliability. For better flexibility, these new servers support open standards including OpenBMC for customized control over functionality and versioning, advanced I/O modules (AIOM) that support OCP V3.0 SFF cards, as well as common redundant power supplies (CRPS).

"As we continue to rapidly expand our production capacity, Supermicro is now well-positioned to service hyperscale datacenters," said Charles Liang, President and CEO of Supermicro. "With that in mind, we have designed the new MegaDC server product line exclusively for internet-scale datacenter customers. MegaDC servers are optimized to reduce deployment times and deliver optimal performance per watt and performance per dollar. We understand that large datacenters often face long lead times for upside demand as well as occasional downside challenges, and Supermicro can help alleviate these demand fluctuation concerns by maintaining healthy inventory levels for our new MegaDC servers."

Today's MegaDC launch introduces five new X11 systems comprised of two 1U systems and three 2U systems available for cloud quantity deployments with sufficient economies of scale. All of these MegaDC systems support two of the new 2nd Gen Intel Xeon Scalable processors, 16 memory slots, an AIOM slot, dual 25G Ethernet ports, and OpenBMC. Additional features include bulk packaging designed to reduce unboxing time, optimized mechanical designs to maximize airflow to the CPUs, memory and GPUs, and low-resistance 12V single-source power distribution to increase system availability and energy-efficiency.

For more information on these new Supermicro products, visit http://www.supermicro.com/MegaDC.

Follow Supermicro on Facebook and Twitter to receive their latest news and announcements.

About Super Micro Computer, Inc.

Supermicro (Nasdaq: SMCI), the leading innovator in high-performance, high-efficiency server technology, is a premier provider of advanced Server Building Block Solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, HPC and Embedded Systems worldwide. Supermicro is committed to protecting the environment through its We Keep IT Green initiative and provides customers with the most energy-efficient, environmentally-friendly solutions available on the market.

Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.

All other brands, names and trademarks are the property of their respective owners.

SMCI-F

View original content to download multimedia: http://www.prnewswire.com/news-releases/supermicro-unveils-megadc-servers--the-first-commercial-off-the-shelf-cots-systems-designed-exclusively-for-hyperscale-datacenters-301016639.html

SOURCE Super Micro Computer, Inc.

View post:
Supermicro Unveils MegaDC Servers - The First Commercial Off The Shelf (COTS) Systems Designed Exclusively for Hyperscale Datacenters - Associated...

Read More..

COVID-19 Global Outbreaks: Coordinating Your Remote Business Operations – China Briefing

By Adam Livermore, Thomas Zhang, andChris Devonshire-Ellis, Dezan Shira & Associates

The COVID-19 outbreak in China triggered the world's largest-ever work-from-home experiment as businesses were called to consider how to effectively utilize their employees while also minimizing physical contact. That issue has now begun to impact businesses across the world, where the infectious nature of the outbreak means avoiding gathering in public, which includes staff working in offices or traveling to the workplace on public transit, in addition to bans on other public gatherings.

The situation is still difficult in China, where not all staff have yet returned to their workplace, and it appears the situation will remain with the global business community for a few months longer. So how can businesses across North America, Europe, South America, Africa, the Middle East, and Asia manage not just their own reductions in office and factory workforce, but also manage their China subsidiary operations as well?

Under these circumstances, most options for the standard operation of businesses using traditional methodologies are temporarily closed. Managers at foreign-invested entities (FIEs) in China have had to move very quickly to pick up the slack. This has meant the establishment of new workflows that mitigate staff travel while also ensuring that work timelines are met. The way employees collaborate remotely has had to be re-assessed. In this article, we look at the methodology and technical solutions for operating a business that is under lockdown or needs to start preparing for the eventuality.

Businesses need to appoint a responsible person to act as a Chief Information Officer (CIO). This can be a high-stress role as the CIO has responsibility for the infrastructure on which the remote office heavily relies, the smooth sharing of information across the organization, and the security of that information. It is vital that a CIO is able to coordinate with all staff, be aware of their location and situation, and is able to accurately disseminate company information, instructions, and assistance where required.

Foremost in the mind of the CIO at such a time should be the security of confidential corporate information. Employees are working from home. Can they rely on the existing corporate infrastructure in place to do their jobs? Is their organization reliant on centralized, physical servers located in server rooms at HQ or in data centers? If so, are your employees at home and in China able to access those servers sufficiently quickly to do their work under current circumstances?

If the company does not put in place an explicit short-term policy and take necessary measures to allow employees to do their work and meet deadlines, the inevitable consequence is that those employees will start to fall back on non-corporate channels of information exchange. In China, this generally will mean the super-app WeChat. Other options include Messenger, LinkedIn, WhatsApp, Twitter, Viber, Yandex, and many other apps pertinent to your specific country.

The dangers of pushing corporate content through such channels hardly needs to be explained here. Companies should do all that they can to look for viable alternatives as speedily as possible.

Options fall into two general categories: improving access to existing corporate information hubs, and promptly creating new corporate channels for information exchange that don't rely on the existing infrastructure.

The first line of thinking assumes that much of the critical corporate information and tools utilized by employees resides on physical infrastructure, rather than in the cloud on platforms like MS Azure / Office365. The migration of this data is not feasible in the short-term, so the best option to facilitate access to this information is to improve both the speed and the security of connecting to that data by employees working in a non-office environment.

Access via RDP (which encrypts data in transmission) can be considered for systems that are not browser-based, and which do not restrict access to office environments. For those systems that are normally restricted to office-based access, the company will need to consider putting in place VPN connections between each employees computer and a cap-stone server that can link to the relevant corporate server. Only very limited destination sources should be enabled for the VPN, to minimize the systems exposure to the internet and thereby reduce the security risk when employees connect with a wider range of non-corporate resources.

Computers operating in such remote environments should be protected by tools, including anti-virus software, as such laptops are no longer inside the security perimeter of your office environment anymore. The anti-virus software installed in your office might not even be updated promptly when laptops are outside of the office, as certain kinds of deployments would only allow updates from internal resources. BitLocker is another tool that should be considered, as the possibility of a laptop being lost or stolen increases rapidly when lots of staff work remotely. BitLocker adds an extra level of access security by encrypting information residing on the local hard disk of the computer.

We can assume that access speed will be negatively affected due to employees working outside of the normal corporate environment, because internet speeds can be expected to be lower. Our firm, Dezan Shira & Associates, has had over 300 employees working from home in China these past weeks, and has found that accessing the internet via a 4G hotspot, by connecting their laptop to their mobile phone, has largely resolved this issue.

Another option available to companies is to look at immediately available alternatives to reduce their over-reliance on corporate infrastructure, which can be difficult for China and other remotely based staff to access from their homes. The most popular options, at present, have been Microsoft Office365 (for foreign companies with employees in China), Alibaba's DingTalk, and Tencent's WeChat Work (for Chinese companies that need a more domestic solution).

Other options, such as Slack, have also seen a big surge in popularity. What these services essentially do is provide a set of corporate communication and collaboration tools that directly links to the company's global directory but does not rely on data sitting within physical company servers. These cloud-based solutions can be more easily accessed anywhere, anytime by employees.

Taking Office365 as an example, this platform provides a wide range of functionalities to support remote working as a byproduct of its overall design. SharePoint Online (SPO) is a cloud-based tool to store and control corporate information, which can be utilized across many other applications.

Microsoft Teams is a communication and collaboration tool that synchronizes with SPO to deliver a wide range of functionality to users. Teams provides a stable option for conference calling between colleagues based both inside and outside of China. Other tools, such as Flow (now renamed Power Automate), allow companies to quickly re-organize and streamline their business processes to meet the challenges of remote-based employees.

More importantly, Office365 enables the company to manage different aspects of their business on one universal platform: email, file sharing and access, and collaboration are all integrated within the same platform and delivered with a high degree of reliability and security. This platform also guarantees that all the information it contains is controlled under the company's own security policies and requirements. This is one big advantage for compliance, both under local laws and regulations and with contractual commitments to clients.

Microsoft has recently introduced a special offer for a free trial of certain tools within the O365 suite of products. Obviously, their plan is to get potential clients comfortable with collaborating through this platform, and then persuade them to sign up to the paid service (with full functionality) at a later date.

Teams is being offered for free for six months and the Office365 E1 package (which includes email, SPO, Flow, and Teams) is being offered for free for three months. There are various permutations depending on whether your company wants to utilize the functionality on the special China version of O365, or the standard global version.

The options mentioned above represent fire-fighting plans for the short-term that can get your employees functioning efficiently again. However, with intelligent planning and implementation, they can also form part of the base for your long-term, cloud, or hybrid cloud IT infrastructure.

While the COVID-19 outbreak poses a short-term challenge for businesses, in the medium- to long-term it may actually provide the catalyst for companies to adopt the remote working concept, supplemented by the wide range of technology tools already available to increase worker efficiency. Indeed, it may herald the final nail in the coffin for the traditional industries: bricks-and-mortar retail, the leasing of physical office environments, and even offline consumer banking.

If you are a CIO at an MNC, opportunities exist for you as well. China can potentially become your petri dish for experimentation with appropriate cloud-based tools suitable for remote working, which (if successful) can later be more easily applied to more change-resistant sectors of your organization outside of China.

Last year, China became the fifth country to commercialize 5G (after South Korea, US, Switzerland, and UK), granting 5G licenses for commercial use to four telecom operators in the country. This is going to further accelerate the adoption of digital tools and platforms that enable remote working in a secure environment.

About Us

China Briefing is written and produced by Dezan Shira & Associates. The practice assists foreign investors into China and has done since 1992 through offices in Beijing, Tianjin, Dalian, Qingdao, Shanghai, Hangzhou, Ningbo, Suzhou, Guangzhou, Dongguan, Zhongshan, Shenzhen, and Hong Kong. Please contact the firm for assistance in China at china@dezshira.com.

We also maintain offices assisting foreign investors in Vietnam, Indonesia, Singapore, The Philippines, Malaysia, and Thailand in addition to our practices in India and Russia and our trade research facilities along the Belt & Road Initiative.

The rest is here:
COVID-19 Global Outbreaks: Coordinating Your Remote Business Operations - China Briefing

Read More..

How Mircom Group is using technology to turn buildings into active, networking machines – The Globe and Mail

Mircom chief technology officer Jason Falbo, left, and CEO Mark Falbo walk through the company's corporate head office in Vaughan, Ont.

Della Rollins/The Globe and Mail

The doorman of a glass-and-steel office block faces a huge, wall-mounted intercom panel. The scene, from French filmmaker Jacques Tati's 1967 film Playtime, makes it impossible to talk about the business of building intercoms and alarm systems without smiling.

Lights flash across the intercom panel like a shooting gallery. Buzzers blare. An overamplified voice blurts out. Mr. Tati famously spoofed modern conveniences. The doorman whistles to himself in a dismissive, Gallic way.

Of course, the joke today is how quaint that all seems. The alarms and building-communication systems made by a handful of multinationals, including the smaller Vaughan, Ont.-based Mircom Group of Companies, are now exponentially more complicated. They tap into cloud servers, networks and wireless apps and can be monitored extensively on-site and even remotely. They effectively turn buildings into active, networking machines.


And this has forced Mircom and its competitors to branch far off from their original business.

"We were really hardware-centric for a long time. We would build the housings, the components, the circuit boards, the devices, and we would supply the hardware. But like most businesses, technology started to converge, says Mark Falbo, Mircoms president and chief executive officer.

"It used to be just a hardwired system with wires running through a building. Now it might be tied into a network infrastructure, an internet infrastructure, whereas before you would sell the equipment and be done with it."

Back in the 1960s, when electronic alarm and intercom systems were relatively new, Mark's father Tony Falbo came to Toronto from Cosenza in southern Italy and was working for Mirtone, which made home intercoms that connected rooms like the kitchen to the living room or bedroom.

Mr. Falbo eventually became part-owner of the business as it branched into fire-detection and alarm equipment. The company was sold by a majority owner in 1988. After a short non-compete period, Mr. Falbo restarted Mircom, originally an intercom division within Mirtone. He took that division with him and grew it by designing and manufacturing an array of alarms, control panels and now software-based emergency systems. The company is currently run by Mr. Falbo's three sons and has around 500 employees competing against the multinationals.

"The fire-detection part of the business is an oligopoly. Its dotted by five large companies Siemens, Honeywell, Johnson Controls and United Technologies Corp. And in Canada, its Mircom, says Mark, the oldest of the three sons. So, by rights, Mircom is the only company of its kind in Canada thats designing, engineering and manufacturing. We have the multinational competitors operating here, but Mircom thinks Canada first.

By that, he means a focus in particular on Canadian building codes. For instance, Canadian regulations tend to be geared more toward emergency crews getting immediate information when they arrive. Large LED panels and graphic displays in the lobby typically provide crisis information more readily than LCD control panels, which can involve scrolling through menus and are more common in the U.S.


But LED panels can take up a lot more space. So control panels are now morphing into software systems or, in industry jargon, single-pane-of-glass systems with computer and smartphone screens increasingly replacing control panels.

"Now we're also using computer workstations, mobile apps and other interfaces to receive that information," says Jason Falbo, chief technology officer and the youngest brother. (The middle of the three brothers, Rick Falbo, manages sales and business development.) Workstations and control panels can be in development for five years or so given their complexity, and larger systems may have as much as four million lines of computer code.

"It's a lot closer to an airplane control system. You can't even compare it to a lighting panel or something like that. They're hugely complex. They're beasts," Jason says.

This rampant technology has pushed the company to expand its product lines and services, although fire-alarm equipment remains around 60 per cent of the business. Fire-equipment servicing is around 15 to 20 per cent. The remainder is intercom and other automated systems. The privately owned company surpassed the nine-figure revenue mark a few years ago.

"Twenty years ago, youd have Mircom supplying the fire-detection equipment, the hardware and the parts that detect and the alarm. Maybe Honeywell would provide access control or intercom security solutions. And potentially Johnson Control would be doing your HVAC [heating, ventilation and air conditioning] controls. Thats how we went to market, Jason says. Now many of these systems are blending through software interfaces and mass notification systems.

Yet the company occupies the unusual niche of also being an original equipment manufacturer at times for its larger competitors, filling in gaps in those companies' product lines. "So, this oligopoly of an industry is a bit strange, in that everybody seems to be a competitor and customer and supplier to the others," Mark says.

At the same time, the move to software-based systems has created a fundamental shift in the entire industry.

"I think the pivot from a focus on fire-detection and alarm hardware to more converged building technologies with a software focus is something that we as a family and as a company really need to deal with," Mark says. "It's sometimes a cultural shift. It's an investment shift. It's a priorities shift."

Still, the convenience of integrated building systems also brings myriad complications. Fire alarm systems, for instance, have to remain within their own separate loop, safeguarded against the failure of other systems. However, information from the fire system can still be monitored by other platforms.

Mike Prsa, a principal and vice-president at engineering firm Mulvey and Banani International Inc. in Toronto, says common internet protocol is necessary for alarm systems to talk to other building systems and to tie into networks and apps.

"They can be integrated, but only in a secondary viewing and annunciation purpose. So I can pull data from my fire alarm system and plug it into my building-intelligence platform for reporting purposes. But I can't command and control the fire alarm system through any other system besides itself," Mr. Prsa says.
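A hedged sketch of what that kind of read-only integration might look like in code: a building-intelligence service periodically pulls status from the fire panel's monitoring interface and forwards it for reporting, but never sends commands back. The endpoints, fields and polling interval here are hypothetical, not any specific vendor's API.

```python
# Hedged sketch of a one-way, read-only integration: poll a fire alarm panel's
# monitoring interface and forward the data to a reporting platform. No commands
# are ever sent back to the panel. URLs and payload fields are hypothetical.
import time
import requests

FIRE_PANEL_STATUS_URL = "https://panel.building.example.com/status"  # hypothetical
REPORTING_INGEST_URL = "https://bi.building.example.com/ingest"      # hypothetical

def poll_once():
    status = requests.get(FIRE_PANEL_STATUS_URL, timeout=5).json()
    # Forward for viewing and annunciation only; command and control stays
    # entirely within the fire alarm system itself.
    requests.post(REPORTING_INGEST_URL, json=status, timeout=5)

def poll_forever(interval_seconds=30):
    while True:
        poll_once()
        time.sleep(interval_seconds)
```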

Then there's the human factor. Advanced controls and elaborate circuitry are fine, but what if the building supervisor can't work them to their full potential?


"One of the missing links that we're finding is that building operators aren't necessarily trained to analyze the data and understand what to do with it," says Marianne Touchie, an assistant professor at the University of Toronto's Faculty of Applied Science and Engineering.

The human element could be one of the most urgent aspects as building-communication and alarm systems become more complex. Mircom sees personal connections to customers as another inroad to compete against the larger multinationals, particularly as systems become more compatible with other systems.

"Some of it is guerrilla business where you go in and do more," Mark says. "We try to be a little bit less proprietary. We try not to lock people into our technology once they acquire our systems. We provide more choice and training in open platforms," he adds.

Because if the doorman or building staff can't figure out the system, then it's of little use.

Go here to read the rest:
How Mircom Group is using technology to turn buildings into active, networking machines - The Globe and Mail

Read More..

Mastering the multicloud maze: how to choose the right solution for your business – UKTN

By Justin Day, CEO of Cloud Gateway

While the advent of the cloud has most definitely dawned, there's still some way to go for widespread adoption of cloud technology in the enterprise.

Often, Boards and C-Suite executives know that using a cloud platform can bring major benefits to their company, from efficiency and flexibility to cost reductions and scalability.

With vendors clambering to sell the best cloud package, and buzzwords such as multicloud, SASE (secure access service edge) and hybrid cloud flying about, it can be tricky for companies to understand the correct solution for their business.

Here we'll explore how businesses can navigate the world of cloud in order to understand what they need from a cloud platform to benefit from the technology.

The definition challenge


Since the cloud broke onto the technology scene, there has been a significant expansion in the number of platforms, business models and key terms within the space.

Cloud technology has made some serious strides in its development and will continue to do so as it becomes further ingrained with business needs and progress.

One of the main issues emerging from the increase in cloud technology offshoots, however, is the confusion around the definitions of these new models, primarily multicloud and hybrid cloud, which are often used interchangeably.

Multicloud by definition actually means the use of multiple as-a-Service types to aid your business such as IaaS, PaaS and SaaS. However, it has quickly become more recognised as using multiple of the major public cloud service providers such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud, Oracle Cloud and more.


Hybrid cloud is about using different types of cloud platforms to get a blend, such as public cloud, private cloud and even virtualised on-premise infrastructure.

Choosing the right answer

With so much confusion and misuse of the terms, it's no surprise that businesses may feel unsure as to which cloud ecosystem to opt for.

Traditionally, businesses who were early adopters of the cloud often chose to utilise a single public cloud system such as Azure or AWS. As a result, companies were forced to adapt their business models to ensure that the applications, data and assets they were placing on the cloud fitted the specifications of that one vendor. Being trapped with one provider has now made it more difficult for those companies to remove their apps and systems from that cloud platform to a more suitable location.


As a general understanding of cloud technology grows, Boards and C-Suite executives are keen to capitalise on the benefits that the technology can bring in the hope that it will help their business remain competitive and up to market standard. This pressure can often lead businesses to rush their cloud transformation and opt for a cloud solution that promises the whole shiny benefits package.

While the package may work for some apps and assets in the business, other areas would be compromised so that they can fit into the chosen platform.

It must be stressed that businesses need to give themselves enough time to properly evaluate their business and understand which apps can stay onsite, which need elasticity, and which can straddle the cloud and the company's on-premise servers.

Instead of just choosing the solution and hoping it meets the company's requirements, the business needs to understand its requirements first in order to define the solution. There are also other considerations in areas such as cost or skill set: it's difficult to keep people skilled in all of the different options, or it becomes unviable.

Experiencing the cloud benefits

While companies should consider their requirements on their own merits, often the most fitting solution is usually a hybrid cloud or multicloud ecosystem.

Both work well for a companys wide range of applications and workloads and will allow them to deploy those in the most suitable locations, instead of constricting them into an inappropriate location from lack of options.

A multicloud approach provides companies with peace of mind by minimising reliance on any one cloud provider, increasing flexibility and scalability. The hybrid option also lends itself to the transition state many organisations will be in as they migrate to the cloud; you can't turn everything off on Friday and be in the cloud on Monday, so naturally you'll be in a hybrid state.

Ultimately cloud technology can bring huge benefits to companies, cutting costs, improving efficiency and application performance and increasing flexibility and innovation.

However, in order to truly experience these benefits, companies must make sure they are using the right toolbox for the job, and that comes with understanding what jobs you need to complete and the right tools to get those jobs done.

Hybrid cloud and multicloud strategies are often the best solutions for companies making the leap to cloud, allowing companies to make the most of their investment and helping to transform their business.

See the original post here:
Mastering the multicloud maze: how to choose the right solution for your business - UKTN

Read More..

The not-so-Smart Home is available now to disable the Server in the Cloud, behind the Lightify – Play Crazy Game

The decision to take part in the smart home trend often turns out to be not so smart, as Lightify customers are now finding out: the company is shutting down the cloud server behind the remote control of its light sources in the not-too-distant future.

As the company has announced, there is still some time: on 31 August 2021, roughly 18 months from now, the server will be switched off. After that, a number of functions that were previously carried out via the cloud connection will be dropped. However, a number of basic functions can still be controlled directly from the smartphone app, as long as you still have it, since the app is set to disappear from the manufacturer's store platforms within the next year or so.

In practical terms, this means that, for example, manual color adjustments can still be made to the lamps, although selecting colors through the color-control scheme will no longer work. In addition, Lightify will no longer allow you to flexibly group the light bulbs in your house with a click. Using a timer or the wake-up light alarm remains possible.

A full list of the functions that will remain the same and those that are to be abolished has been made available. It also shows that voice control of the Lightify bulbs via Amazon's or Google's systems will no longer work. On the other hand, control can still be set up via router-based systems, which is to be welcomed.

There is a way to keep using the Lightify products, and even with a wider range of functions, because they are addressed via the industry-standard ZigBee protocol. If you're willing to put in a system from another vendor that supports ZigBee, you can integrate the lamps into it and retain, or even gain, a higher level of functionality.


More:
The not-so-Smart Home is available now to disable the Server in the Cloud, behind the Lightify - Play Crazy Game

Read More..