Category Archives: Cloud Servers
Why Financial Market Data is Moving to The Cloud – Institutional Investor
By Russell Blinch for CME Group
AT A GLANCE
Finding a market price used to mean tilting your head up to a big chalkboard hanging over a trading floor. Perhaps runners developed the first Application Programming Interfaces (APIs) back in the 1800s when they donned fur sleeves to prevent any accidental erasure of the precious info.
But market information migrated from carrier pigeons, to blackboards, to newspapers, and then to servers. All because of a need for more and more quality market data to feed investors hungry for intelligence.
As the need for reliable data continues to grow, all types of market participants are looking for ways to make data management more cost-effective and efficient.
To find this nirvana, the industry is increasingly looking heavenward: to the cloud.
A Financial Fork in The Road
Cloud technology is hardly a new concept. However, the cloud is a major fork in the road for everyone in the financial sector due to a nexus of factors, not least of which is the adjustment to a work-from-home environment in 2020.
Especially for firms that may not have the need, expertise, or budget for onsite infrastructure, servers or data storage, there is growing interest in cloud-based market data solutions.
Big Data Meets Dynamic Data
"This is an inflection point," declared Bill Bierds, President of the software house BCC Group, in a video presentation. "I think you are going to see a lot more acceleration into the cloud, and market data is just one area that is ripe and ready to go."
The cloud storage market, underpinned by tech giants such as Google, Microsoft and Amazon, is expected to triple in value to over $100 billion annually by 2024, according to a report by Market Data Forecast.
All kinds of financial data are floating upwards, but market pricing data is expected to be a major growth area for the cloud, according to industry experts.
CME Group became the first derivatives exchange to offer a real-time, cloud-based market data product when it announced the launch of CME Smart Stream in 2019. This provided clients access to its real-time data feed in conjunction with the Google Cloud Platform. In July 2020, CME Group announced clients could also tap delayed data through the cloud.
They can also now access a mind-boggling 450 terabytes of data with a few clicks of a mouse, without having to access hardware or provision network equipment in remote data centers.
"Our adoption of cloud technologies was driven by our mission to make our data more accessible to a broader base of customers, without compromising on quality," said Trey Berre, Global Head of CME Data Services.
"Now, more than ever, we know many market participants are looking for solutions that enable greater flexibility. Our collaboration with Google Cloud has made it easier for our clients to access the data they need from anywhere with an internet connection."
Democratization of Data
Large institutions aren't the only ones levitating to the cloud.
Fintech companies are driving innovation that could mean a levelling of the playing field, where small, nimble firms take advantage of the cloud and win customers. Start-ups leveraging the cloud now have access to as many resources and as much information as large financial institutions: infinite computing and storage, unlimited global data movement, and access to next-generation technology like quantum computing.
Said Bierds: "A lot of the sell-side firms have tons and tons of resources and places to do all this analytics and algorithmic work. But a lot of the small firms don't have that. The cloud is bringing that; it's the democratization of data that's happening here."
A case in point is Skew Ltd, a crypto trading platform based in London, which was founded in 2018 by two former JP Morgan and Citigroup derivatives traders, Emmanuel Goh and Tim Noat. The company raised $7 million in funding and has received backing from leading investors such as Kleiner Perkins, Digital Currency Group and Octopus Ventures.
Skew announced in April 2020 that it was launching a trade execution platform in conjunction with Kyte Broking, and became one of the first companies to leverage CME Smart Stream to power its solution.
Skew offers both trading and analytics and provides an institutional-grade overview of crypto derivatives markets, with over 100 charts on futures and options and historical data.
Skew's client focus is on corporations and institutions, where market participants tap Skew's dashboards daily. According to the company, more than 1,000 firms have signed up for the market data analytics service since its launch last year.
Tim Noat, Skew's COO, said one highlight of working with CME Smart Stream was how cost-effective it proved for a company of their size, giving them that extra step in a competitive field.
"On the business side of things, the main reason is the cost," Noat said in an interview. "And not only is it cheaper to receive the data from the cloud, it is also convenient for those of us in the cryptocurrency industry."
"We were among the first to get in touch with CME, build our relationships with the teams, and get early access to the cloud, which made it possible for us to have a competitive edge."
Noat and Daniel Cullender, Skew's Chief Technology Officer, also highlighted the importance of the cloud for speedy execution and the ability to quickly deliver their services to customers. Startups need to move fast to build viable services for customers, an ability cloud services allow. From their first conversation with CME Group, Skew was onboarded and using Smart Stream within just six weeks.
Cullender extolled the relative ease of aligning with CME Group's offering and praised the ready tech support from Google. Other benefits include advanced monitoring and alert setup, with no agonizing over co-locating data in expensive installations.
"And that means the costs are kept very, very low," he said.
Noat said that while virtually everyone in the crypto space is in the cloud, the movement to embrace the technology in the rest of the industry is underway. But it will not happen overnight. "So I think it's going to come, but it's going to take time," he said.
The world's major stock and derivatives exchanges are moving into the cloud, likely to meet client demand. According to research by Greenwich Associates, some 93% of market data professionals say they are planning to use the cloud to help manage their data.
Cloud in the Time of Virus
Other experts agree the march to the cloud is a process. But it's being kickstarted by the shift to remote work during the current pandemic.
"I think this is a very strange time, and this very sad time in our history is forcing leaders to rethink the cloud, and to rethink where their workload is running and who is running their workload," said Bierds of BCC.
For Skew, Noat said, the company has been helped by the fact that its operation was on the cloud from day one.
"Covid has opened our eyes to the ability of our employees and everyone to keep focused on the goal. Productivity has actually been very good."
Stephane Dubois, CEO of Xignite, told the Finextra financial portal that in the old data paradigm, companies were spending 90% of their budgets on maintaining their data infrastructure. Only the remaining 10% went toward innovation.
"Now, we can access this data very easily and you can really build new user experiences, new analytics, even new AI systems and models with the data," he said.
What Is QuickBooks Server Hosting and Who Should Use It? – Loop21
When we think of hosting QuickBooks in the cloud, we mostly refer to public clouds, where the setup is shared by many other clients globally via the internet. Cloud technology offers many benefits for accountants and CPAs, as it provides great value and reduces costs. However, security concerns remain. This is the major reason for a sudden increase in the number of accounting firms opting for private clouds. Businesses that need more security would need to invest in private clouds. However, they should also ensure that they can trust the performance and availability of the services.
QuickBooks server hosting is a form of cloud computing in which a single company or individual hosts its QuickBooks software on a remote server that is isolated from others. QuickBooks Enterprise cloud hosting offers great performance and a scalable setup, combining accessibility, safety, and time efficiency.
Mentioned below are some of the benefits offered by a QuickBooks cloud server.
QuickBooks on a private cloud provides a greater level of protection compared to the regular version. All data and records are saved and managed on servers to which no other company has access. With high-end security, firewall rules, and updated anti-virus software, you are assured that your data is in safe hands.
As the software is hosted on the cloud, systems in the organization do not need to be updated on a day-to-day basis. To operate QuickBooks, employees only need access to the internet. Another advantage of hosting QuickBooks on the cloud is the increase in performance. As the hardware of the office computers is not used to run the software, systems in the organization are less loaded and run effortlessly.
Taking hosting services from a hosting service provider also plays an important role in decreasing the overall operating cost. This can be done by hosting multiple pieces of software used in the organization on the cloud. For example, the same server can be used to host QuickBooks and Office 365 for an organization. A simple IMAP migration to Office 365 can be used to host Office 365 in the cloud.
QuickBooks cloud hosting allows you to customize the service according to your business needs. It can be customized for different objectives, sizes, and other important factors. Determine your business's requirements and decide what works best for you depending on scalability, security, and availability requirements.
Therefore, QuickBooks cloud hosting becomes the best option for organizations that can benefit from the greater flexibility of a virtualized environment. It improves performance, as the servers have more computing power for running heavier applications.
Want to switch to hosted QuickBooks Desktop and experience the best services? Reach out to Apps4Rent today!
Apps4Rent is one of the best cloud hosting providers, having been in this field for the past 15 years. With quality services and a dedicated IT and support team, you are assured of never facing any problems with cloud hosting. We also provide virtual machine services; to get Azure Windows Virtual Desktop pricing, contact Apps4Rent.com and get your dream desktop today!
Why Snowflake Is Freezing Out Its Competition – Morning Brew
Sherlock Holmes stored info in his mind palace; Snowflake stores info in a cloud-based data warehouse.
But it's no mystery that you've been hearing the latter's name so much this week: Snowflake raised almost $3.4 billion Wednesday, making it the highest-valued software startup IPO in history.
Snowflakes business comes down to one thing: storing a massive amount of data on cloud servers. Companies across virtually every industry are investing in data analytics, and as they collect more and more info, they need somewhere to put it.
In that sector, which includes giants like Amazon Redshift and Google BigQuery, Snowflake is freezing out its competition. As the space's indie frontrunner, it has a rep for being low-maintenance and offering easy scalability.
Big picture: We're not going to get into the "data is the new oil" debate, but it's clear that data-driven decision-making is here to stay...and grow.
Low latency is the name of the game for IoT. Can your network keep up? – Techgoondu
Brought to you by SPTel
Many of today's transformation efforts, from digitalised factory floors to smart retail outlets, require the use of sensors and other connected Internet of Things (IoT) devices to feed them data for analysis and decision making.
The connected machines in a factory, for example, could send data on the yield of a manufacturing process, allowing managers to adjust its configuration to improve efficiency and output.
At a smart retail outlet, sensors could help track footfall as well as the items that attract the most interest, thus enabling a store manager to proactively stock up on items that are moving fast.
However, in many transformation stories, the challenges involved are often left unsaid.
Fast IoT deployment, efficient data collection and high application performance all need to be weighed and measured against time and cost.
Getting IoT deployed to automate your business operations may be easier said than done, with the hassle of managing multiple parties for the device gateway, management platform, computing resource and connectivity. This can also result in hefty investment and subject your business to additional CAPEX.
Then there is the problem of device performance. An on-premise solution will have the lowest latency but is expensive to deploy and your costs only increase with multiple location use cases.
Some businesses may choose to process all their data on the cloud to lower their upfront investment and operating cost. However, this increases latency as the cloud servers may be hosted in a location that is significantly farther away.
Thats not to mention the additional cost of bandwidth required for transmitting data intensive solutions, such as video analytics, to cloud resources.
Consider the example of a simple face recognition sensor at a gate. The processing of the image data as well as the matching of IDs is best done near the edge of the network instead of a round trip to a cloud server. This is where a fast connection at the edge is beneficial.
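To get a rough sense of why that round trip matters, propagation delay alone scales with distance. A minimal back-of-envelope sketch in Python, where the distances and the fiber light-speed figure are illustrative assumptions, not SPTel measurements:

```python
# Back-of-envelope round-trip propagation delay: nearby edge hub vs. distant cloud region.
# Light in optical fiber travels at roughly 2/3 the speed of light in vacuum,
# i.e. about 200,000 km/s, or 200 km per millisecond.
FIBER_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a given one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

edge_hub_km = 20        # assumed: an island-wide edge computing hub a few km away
cloud_region_km = 3000  # assumed: a regional cloud data centre overseas

print(f"Edge hub:     {round_trip_ms(edge_hub_km):.2f} ms")   # 0.20 ms
print(f"Cloud region: {round_trip_ms(cloud_region_km):.2f} ms")  # 30.00 ms
```

Queuing, processing, and gateway hops add further delay on top of this, but the gap between a sub-millisecond edge round trip and a tens-of-milliseconds cloud round trip is what makes edge placement attractive for use cases like gate-side face matching.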
In light of these challenges, how can organisations move towards IoT adoption while resolving performance and cost issues?
SPTel's IoT as a Service (IoT-a-a-S)
With SPTel's state-of-the-art network and award-winning IoT-a-a-S solution, IoT deployment is made simple.
This one-stop solution removes the hassle of managing multiple parties. Simply provide the sensor, connect it to their device management platform and leverage their resources for computing and connectivity needs.
What's more, SPTel's offering is available as a subscription service, the first in Singapore. With it, a business can deploy IoT easily and quickly without hefty upfront investment in technology that it may not be able to maintain, manage and update.
IoT devices can also connect to edge computing resources for low-latency performance without having to invest heavily in on-premise solutions.
Because of SPTel's pervasive hubs around the island for edge computing, customers can be assured that computing is done closer to the data source.
This means a shorter distance for data to travel, while also reducing the need for extensive backhaul to the cloud or data centre, lowering bandwidth requirements.
Businesses will also benefit from seamless data processing, with compute, storage and analysis all on SPTel's IoT-a-a-S.
This multiprotocol and open API platform ensures interoperability across sensors and gateway requirements to provide holistic and integrated IoT deployment and operations management.
Try out SPTel's IoT service
There is a wide variety of applications that SPTel's offering can work with, because the platform is IoT protocol agnostic.
These could include public safety and security systems, transport, public amenities such as street lighting, energy and water resource management, lift monitoring and other facilities management services.
No matter the application, SPTel will be the single provider to take care of the requirements, end-to-end. This includes application enablement, device access, data collection, processing, storage, and network transport.
For now, SPTel's IoT-a-a-S is available as a free trial, with a 50 per cent subsidy on pilot fees for a limited time.
It gives IoT service providers an all-in-one solution for device management, edge computing, IoT gateway (Wi-Fi and LoRa) and backhaul connectivity to support their operational requirements for application hosting.
Supported and funded by Enterprise Singapore, this introductory programme will serve as an innovation incubator, providing aspiring partners, SMEs and enterprises a ready platform for IoT deployment and testing at a subsidised rate.
For businesses that are unsure where to start to transform their operations with IoT, there are a number of resellers that work closely with SPTel to provide solutions through their IoT-a-a-S offering.
With expertise in a wide variety of verticals, including lift management and pest control, these partners bring industry expertise along with the award-winning technology that SPTel provides.
In other words, there's never been an easier way to get started with an IoT deployment. SPTel's IoT-a-a-S is the first step for many businesses looking to a digital future.
To find out more about SPTel's IoT-a-a-S programme, download the whitepaper here. Sign up here for complimentary webinars as part of SPTel's virtual event.
Best things to look at in a VPS web hosting – Techiexpert.com – TechiExpert.com
The use of VPS to run your website is one of the significant decisions you need to make. Once you have decided, the next step is to choose the right VPS website host to help you.
Getting the right VPS provider is not that difficult if you know what to look for. Below are the things to consider.
RAM
This is the server's memory capacity, and it is one of the essential things you need to consider in your VPS host. The RAM capacity affects the server's speed, the power it uses while running, and how fast your website will load.
If you have various websites you want to run, you need a VPS with enough RAM to support them. To ensure you have the best RAM, ask whether their servers support burstable RAM or do RAM backups. HostArmada VPS uses servers that back up using cloud hosting.
Platform
It is good to know whether you need a Linux-based or Windows-based platform. However, you can go the extra mile and seek VMware or other advanced management systems that allow you to do more with your VPS.
Control panel
A control panel is right when you need admin access to your website. Various control panel options are user-friendly; some are WHM, cPanel, and Plesk. Such a system is sound because, if you are new to running a website, the admin tools can help you solve the problems you might be facing.
Customer support
The customer support that you receive from any web host is vital. When something goes wrong with your server, your website might crash. And when the problem recurs now and again, it might cost you a lot.
From this, customer support becomes essential, because they will come in and help solve the problem.
Before you choose the VPS host, email or phone them to gauge their responsiveness, and note how long they take to reply to you. The service you receive here will indicate how they will attend to you once you choose them as your VPS host.
Flexibility
The VPS host's flexibility is one of the most important things you need to check when choosing your company. Are you able to upgrade or downgrade your website's resources when the need arises? Check how flexible the company is with your server. Always try to ask how easy it is to change your services whenever you would like to do so, and how long it will take. Given the various parameters, it is good to settle on a VPS host that is flexible. This will allow you to make changes that can boost your business.
Final thought
The use of VPS web hosting can be a great idea. VPS web hosting is one of the best types of web hosting on the market. You need to know how to choose the best host, one that will enable you to reach your goal.
Global Cloud Based Collaboration Software Market : Industry Analysis and Forecast (2020-2027) By Deployment Type, Service, Solution, Enterprise Size,…
Global Cloud Based Collaboration Software Market was valued at US$ XX Bn in 2019 and is expected to reach US$ XX Bn by 2027, at a CAGR of XX% during a forecast period.
Cloud-based collaboration software raises workforce productivity, as documents can be easily accessed anytime and from anywhere. Cloud-based collaboration software plays an important role in the software industry, as it is especially popular with software developers, supporting source code and preserving the formatting of several kinds of programming languages.
The report has analyzed the revenue impact of the COVID-19 pandemic on the sales revenue of market leaders, market followers and disrupters, and the same is reflected in our analysis.
The objective of the report is to present a comprehensive assessment of the market and contains thoughtful insights, facts, historical data, industry-validated market data and projections with a suitable set of assumptions and methodology. The report also helps in understanding cloud-based collaboration software market dynamics, structure by identifying and analyzing the market segments and project the global market size.
Further, the report also focuses on a competitive analysis of key players by product, price, financial position, product portfolio, growth strategies, and regional presence. The report also provides PEST, Porter's, and SWOT analyses to address shareholders' questions on prioritizing efforts and investment in emerging segments of the cloud-based collaboration software market in the near future.
The driving factors of the global cloud-based collaboration software market include the increasing number of enterprises implementing cloud-based collaboration services, including software-as-a-service. Another factor is that it ensures complete documentation is kept in the same place, so that track of different versions of attachments is maintained. Low cost and workforce productivity are also fuelling the growth of the market. However, high internet dependency and a lack of data security are hampering market growth.
Request a sample of the Cloud Based Collaboration Software Market report at: https://www.maximizemarketresearch.com/request-sample/25275
Unified communication & collaboration is expected to boost growth over the forecast period. Unified communication & collaboration software helps keep documents in the same workplace, which reduces the user's work of keeping track of these documents across various workplaces. It allows employees, clients, suppliers and various other users to coordinate tasks and share information in real time, including by fax, SMS, and email.
Hybrid cloud is the fastest-growing segment over the forecast period. Hybrid cloud uses the features of both private and public clouds, and it helps decrease deployment cost. Agility, security, and compliance are some of the key benefits of hybrid cloud. Startups and small-to-medium-scale businesses adopt hybrid cloud for data recovery and backups. Small-to-medium-scale businesses usually use cloud-based collaboration software, which encourages their growth in the market.
Make an inquiry before purchasing the Cloud Based Collaboration Software Market report here: https://www.maximizemarketresearch.com/inquiry-before-buying/25275
Asia Pacific countries such as China, Japan, and India accounted for the fastest-growing market for cloud-based collaboration software in 2017. Increasing demand for cloud servers and policies such as BYOD among enterprises helps stimulate the growth of the cloud-based collaboration software market in Asia Pacific. In Europe, the market for cloud-based collaboration software is increasing exponentially, as technology penetration rises and enterprises focus on offering a better customer experience to gain competitive benefit.
Scope of the Global Cloud Based Collaboration Software Market
Global Cloud Based Collaboration Software Market by Deployment Type: Public Cloud, Private Cloud, Hybrid Cloud
Global Cloud Based Collaboration Software Market by Service: SaaS, PaaS, Consulting
Global Cloud Based Collaboration Software Market by Solution: Unified Communication & Collaboration, Document Management System, Project & Team Management, Enterprise Social Collaboration
Global Cloud Based Collaboration Software Market by Enterprise Size: Small and Medium Enterprises, Large Enterprises
Global Cloud Based Collaboration Software Market by Vertical: IT and Telecom, BFSI, Retail, Government, Others
Global Cloud Based Collaboration Software Market by Geography: North America, Europe, Asia-Pacific, Middle East & Africa, South America
Key Players operating in the Global Cloud Based Collaboration Software Market: Microsoft Corp., Google, Slack Technologies Inc., Salesforce.com Inc., Jive Software Inc., Mitel Networks Corporation, Box Inc., Aspect Software, Cisco Systems, Oracle
Major Table of Contents of the Cloud Based Collaboration Software Market Report
Browse the full report with facts and figures of the Cloud Based Collaboration Software Market at: https://www.maximizemarketresearch.com/market-report/global-cloud-based-collaboration-software-market/25275/
About Us:
Maximize Market Research provides B2B and B2C market research on 20,000 high-growth emerging technologies & opportunities in Chemical, Healthcare, Pharmaceuticals, Electronics & Communications, Internet of Things, Food and Beverages, Aerospace and Defense and other manufacturing sectors.
Contact info:
Name: Vikas Godage
Organization: Maximize Market Research Pvt. Ltd.
Email: sales@maximizemarketresearch.com
Contact: +919607065656 / +919607195908
Website: http://www.maximizemarketresearch.com
Follow Maximize Market Research on Twitter: https://twitter.com/MMR_Business
Follow Maximize Market Research on LinkedIn: https://www.linkedin.com/company/maxmize-market-research-pvt-ltd
Bringing the power of embedded analytics to your apps and services with Amazon QuickSight – idk.dev
In the world we live in today, companies need to quickly react to change, and to anticipate it. Customers tell us that their reliance on data has never been greater than it is today. To improve decision-making, you have two types of data transformation needs: data agility, the speed at which data turns into insights, and data transparency, the need to present insights to decision makers. Going forward, we expect data transformation projects to become a centerpiece in every organization, big or small.
Furthermore, applications are migrating to the cloud faster than ever. Applications need to scale quickly to potentially millions of users, have global availability, manage petabytes of data, and respond in milliseconds. Such modern applications are built with a combination of these new architecture patterns, operational models, and software delivery processes, and allow businesses to innovate faster while reducing risk, time-to-market, and total cost of ownership.
An emerging area from these two trends is to combine the power of application modernization with data transformation. This emerging trend is often called embedded analytics, and is the focus of this post.
Applications generate a high volume of structured and unstructured data. This could be clickstream data, sales data, data from IoT devices, social data, and more. Customers who are building these applications (such as software-as-a-service (SaaS) apps or enterprise portals) often tell us that their end-users find it challenging to derive meaning from this data because traditional business intelligence (BI) approaches don't always work.
Traditional BI tools live in disparate systems and require data engineering teams to provide connectivity and continuous integration with the application, adding complexity and delays to the overall process. Even after the connectivity is built, you must switch back and forth between your application and the BI tool, causing frustration and decreasing the overall pace of decision-making. Customers tell us that their development teams are constantly looking for new ways to delight their users, and embedding BI capability directly into their applications is one of the most requested asks from their end-users.
Given the strategic importance of this capability, you can use it to differentiate and up-sell as a new service in your applications. Gartner research demonstrates that 63% of CEOs expect to adopt a product-as-a-service model in the next two years, making this a major market opportunity. For example, if you provide financial services software, you can empower users to perform detailed analysis of portfolio performance trends. An HR solution might enable managers to visualize and predict turnover rates. A supply chain management solution could embed the ability to slice and dice KPIs and better understand the efficiency of logistics routes.
The approach to building an embedded analytics capability needs to deliver on the requirements of modern applications. It must be scalable, handle large amounts of data without compromising agility, and seamlessly integrate with the applications user experience. Choosing the right methodology becomes especially important in the face of these needs.
You can build your own embedded analytics solution, but although this gives you maximum control, it has a number of disadvantages. You have to hire specialized resources (such as data engineers for building data connectivity and UX developers for building dashboards) and maintain dedicated infrastructure to manage the data processing needs of the application. This can be expensive, resource-intensive, and complex to build.
Embedding traditional BI solutions that are available in the market has limitations as well, because they're not purpose-built for embedding use cases. Most solutions are server-based, meaning that they're challenging to scale and require additional infrastructure setup and ongoing maintenance. These solutions also have restrictive, pay-per-server pricing, which doesn't fully meet the needs of end-users that are consuming applications or portals via a session-based usage model.
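The difference between pay-per-server and session-based pricing is easiest to see with a quick calculation. A minimal sketch in Python, where every price and capacity figure is a made-up illustrative assumption, not an actual vendor rate:

```python
import math

# Hypothetical comparison of server-based vs. session-based BI pricing.
# All numbers below are assumed for illustration only.
SERVER_MONTHLY_COST = 1500.0  # assumed: fixed cost per BI server per month
SERVER_CAPACITY = 500         # assumed: peak concurrent users one server can handle
PRICE_PER_SESSION = 0.30      # assumed: cost per reader session

def server_based_cost(peak_concurrent_users: int) -> float:
    """Fixed cost: provision enough servers to cover peak concurrency."""
    servers = max(1, math.ceil(peak_concurrent_users / SERVER_CAPACITY))
    return servers * SERVER_MONTHLY_COST

def session_based_cost(sessions_per_month: int) -> float:
    """Usage-based cost: pay only for sessions actually consumed."""
    return sessions_per_month * PRICE_PER_SESSION

# A portal with bursty, occasional usage: a high peak but few total sessions.
print(server_based_cost(1200))   # 3 servers needed to cover the peak
print(session_based_cost(2000))  # 2,000 sessions actually consumed
```

With these assumed numbers, the server-based model costs 4,500 a month to cover a 1,200-user peak, while the session-based model costs 600 for the 2,000 sessions actually used; the fixed model charges for idle capacity, the usage-based one does not.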
At AWS re:Invent 2019, we launched new capabilities in Amazon QuickSight that make it easy to embed analytics into your applications and portals, empowering your customers to gain deeper insights into your applications data. Unlike building your own analytics solution, which can be time-consuming and hard to scale, QuickSight allows you to quickly embed interactive dashboards and visualizations into your applications without compromising on the ability to personalize the look and feel of these new features.
QuickSight has a serverless architecture that automatically scales your applications from a few to hundreds of thousands of users without the need to build, set up, and manage your own analytics infrastructure. These capabilities allow you to deliver embedded analytics at hyperscale. So, why does hyperscale matter? Traditional BI tools run on a fixed amount of hardware resources, therefore more users, more concurrency, or more complex queries impact performance across all users, which requires you to add more capacity (leading to higher costs).
The following diagram illustrates a traditional architecture, which requires additional servers (and higher upfront cost) to scale.
With QuickSight, you have access to the power and scale of the AWS Cloud. You get auto-scaled, consistent performance no matter the concurrency or scale of the user base, and a truly pay-per-use architecture, meaning you only pay when your users access the dashboards or reports. The following diagram illustrates how QuickSight scales seamlessly with its serverless architecture, powered by the AWS Cloud.
Furthermore, QuickSight gives your users machine learning-based insights such as anomaly detection, forecasting, and natural language queries. It also has a rich set of APIs that allow you to programmatically manage your analytics workflows, such as moving dashboards across accounts, automating deployments, and managing access for users with single sign-on (SSO).
We recently announced the launch of additional embedding capabilities that allow you to do even more with QuickSight embedded analytics. QuickSight now allows you to embed dashboard authoring within applications (such as SaaS applications and enterprise portals), allowing you to empower your end-users to create their own visualizations and reports.
These ad hoc data analysis and self-service data exploration capabilities mean you don't have to repeatedly create custom dashboards based on requests from your end users, and can provide end users with even greater agility and transparency with their data. This capability helps create product differentiation and up-sell opportunities within customer applications.
With this launch, QuickSight also provides namespaces, a multi-tenant capability that allows you to easily maintain data isolation while supporting multiple workloads within the same QuickSight account. For example, if you're an independent software vendor (ISV), you can now assign dedicated namespaces to different customers within the same QuickSight account. This allows you to securely manage multiple customer workloads as users (authors or readers) within one namespace, and they can only discover and share content with other users within the same namespace, without exposing any data to other parties.
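Namespaces are created through the QuickSight CreateNamespace API, one per tenant. The sketch below builds one request per ISV customer; the tenant names and account ID are illustrative, and the API calls themselves are shown commented out.

```python
# Sketch: one CreateNamespace request per ISV customer, for tenant isolation.
# Tenant names and the account ID are illustrative placeholders.

def namespace_requests(account_id: str, tenants: list) -> list:
    """Build one create_namespace parameter set per customer."""
    return [
        {
            "AwsAccountId": account_id,
            "Namespace": tenant,
            # QUICKSIGHT is the identity store namespaces support.
            "IdentityStore": "QUICKSIGHT",
        }
        for tenant in tenants
    ]

reqs = namespace_requests("111122223333", ["acme-corp", "oktank"])
# client = boto3.client("quicksight")
# for r in reqs:
#     client.create_namespace(**r)
```

Users registered into one namespace can then only see and share content within that namespace, which is what keeps each customer's workload isolated inside a single account.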
Even without namespaces, you could set up your own embedded dashboards for hundreds of thousands of users with QuickSight. For example, see the following dashboard for our fictional company, Oktank Analytica.
With namespaces in place, you can extend this to provide ad-hoc authoring capabilities using curated datasets specific to each customer, created and shared by the developer or ISV. See the following screenshot.
For more information about these new features, see Embed multi-tenant analytics in applications with Amazon QuickSight.
Customers are already using embedded analytics in QuickSight to great success. In this section, we share the stories of a few customers.
Blackboard is a leading EdTech company, serving higher education, K-12, business, and government clients around the world.
"The recent wave of digital transformation in the global education community has made it clear that it's time for a similar transformation in the education analytics tools that support that community," says Rachel Scherer, Sr. Director of Data & Analytics at Blackboard. "We see a need to support learners, teachers, and leaders in education by helping to change their relationship with data and information: to reduce the distance between information and experience, between informed and acting."
"A large part of this strategy involves embedding information directly where our users are collaborating, teaching, and learning, providing tools and insights that aid in assessment, draw attention to opportunities learners may be missing, and help strategic and academic leadership identify patterns and opportunities for intervention. We're particularly interested in making the experience of being informed much more intuitive, favoring insight-informed workflows and/or embedded prose over traditional visualizations that require interpretation."
"By removing the step of interpretation, embedded visualizations make insights more useful and actionable. With QuickSight, we were able to deliver on our promise of embedding visualizations quickly, supporting the rapid iteration that we require, at the large scale needed to support our global user community."
For more information about Blackboard's QuickSight use case, see the AWS Online Tech Talk Embedding Analytics in your Applications with Amazon QuickSight at the 25:50 mark.
Syndication Insights (SI) enables Comcast's syndicated partners to access the same level of rich data insights that Comcast uses for platform and operational improvements.
"The SI platform enables partners to gain deeper business insights, such as early detection of anomalies for users, while ensuring a seamless experience through embedded, interactive reports," says Ajay Gavagal, Sr. Manager of Software Development at Comcast. "From the start, scalability was a core requirement for us. We chose QuickSight as it is scalable, enabling SI to extend to multiple syndicated partners without having to provision or manage additional infrastructure. Furthermore, QuickSight provides interactive dashboards that can be easily embedded into an application. Lastly, QuickSight's rich APIs abstract away a lot of functionality that would otherwise need to be custom-built."
For more information about how Comcast uses QuickSight, see the AWS Online Tech Talk Embedding Analytics in your Applications with Amazon QuickSight at the 38:05 mark.
Panasonic Avionics Corporation provides customized in-flight entertainment and communications systems to more than 300 airlines worldwide.
"Our cloud-based solutions collect large amounts of anonymized data that help us optimize the experience for both our airline partners and their passengers," says Anand Desikan, Director of Cloud Operations at Panasonic Avionics Corporation. "We started using Amazon QuickSight to report on in-flight Wi-Fi performance, and with its rich APIs, pay-per-session pricing, and ability to scale, we quickly rolled out QuickSight dashboards to hundreds of users. The constant evolution of the platform has been impressive: ML-powered anomaly detection, Amazon SageMaker integration, embedding, theming, and cross-visual filtering. Our users consume insights via natural language narratives, which allows them to read all their information right off the dashboard with no complex interpretation needed."
EHE Health is a national preventive health and primary care Center of Excellence provider system.
"As a 106-year-old organization moving toward greater agility and marketplace nimbleness, we needed to drastically upgrade our ability to be transparent within our internal and external ecosystems," says David Buza, Chief Technology Officer at EHE Health. "With QuickSight, we are not constrained by pre-built BI reports, and can easily customize and track the right operational metrics, such as product utilization, market penetration, and available inventory to gain a holistic view of our business. These inputs help us to understand current performance and future opportunity so that we can provide greater partnership to our clients, while delivering on our brand promise of creating healthier employee populations."
"QuickSight allowed our teams to seamlessly communicate with our clients, all viewing the same information simultaneously. QuickSight's embedding capabilities, along with its secure platform, intuitive design, and flexibility, allowed us to service all stakeholders, both internally and externally. This greater flexibility and customization allowed us to fit our clients' needs seamlessly."
Where data agility and transparency are critical to business success, embedded analytics can open a universe of possibilities, and we are excited to see what our customers will do with these new capabilities.
See the original post here:
Bringing the power of embedded analytics to your apps and services with Amazon QuickSight - idk.dev
Actifio pushes cloud DR that accelerates slow object storage to near SSD speed – Blocks and Files
ESG has validated Actifio's claim that it can provide public cloud disaster recovery at near SSD speed using slower, low-cost object storage in the Google Cloud Platform (GCP).
A yet-to-be-published ESG technical review paper, Microsoft SQL Server Recovery and Performance with Actifio on Google Cloud, sponsored by Actifio, examined Actifio's claims.
ESG used a test bed with a 967GB SQL Server production database instance running in GCP and backed it up with Actifio to SSD-class storage, which GCP calls Persistent Disk. The backup was then replicated to a different GCP region and stored there on Google Cloud Nearline (object-class) storage.
It took Actifio 90 minutes to fully back up the database and replicate it to the second DR region's Nearline storage. A policy was then set for Actifio to back up and replicate in an incremental-forever manner with application consistency.
Then ESG initiated an on-demand Actifio Sky appliance in the DR region and used Actifio Global Manager to spin up an on-demand SQL Server host there. This host mounted the backup image in the Nearline storage through the Sky appliance as a virtual block device.
This makes the Nearline objects look like block storage to the SQL Server host, which can only operate with block storage. Without this spoofing capability, the Nearline objects would have to be rehydrated to GCP's Persistent Disk storage class.
The Actifio appliance then brought the DR SQL Server instance online, 5.5 minutes after starting the DR process. Its performance was then compared to that of the production database.
With a 50/50 read/write mix, the Actifio DR instance running off Nearline storage, with a Persistent Disk cache, provided 95 per cent of the performance of the production instance running on Persistent Disk. With an 80/20 read/write mix, it provided 97.6 per cent of the production instance's performance.
The ESG analysts then looked at the costs of this kind of DR and modelled the expected costs for a company that needed to support a 1TB SQL Server production environment over a three-year period, without taking into consideration capacity and performance growth requirements or soft costs such as administration. They compared Actifio's DR costs to a more traditional model of running backups and replicating full copies of the database for each DR copy.
They stated in the paper: "The cost of storage for Actifio over three years is $76,080 compared to $648,000 in a traditional model. This represents a cost savings of $571,920, an 88 per cent cost reduction."
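ESG's quoted figures can be checked with a couple of lines of arithmetic:

```python
# Verify the savings and percentage reduction quoted from the ESG paper.
actifio = 76_080       # three-year Actifio storage cost, USD
traditional = 648_000  # three-year traditional-model storage cost, USD

savings = traditional - actifio
reduction = savings / traditional

print(savings)                  # 571920
print(round(reduction * 100))   # 88
```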
Actifio's SVP for product marketing, Chandra Reddy, said that Actifio is doing for DR what Snowflake does for data warehousing: "One of the basic characteristics of data warehousing is to grow from tens of terabytes to hundreds of terabytes, or even petabytes, very quickly and shrink back. This requires the use of a scalable storage layer. Snowflake chose to use cloud-based object storage such as AWS S3, Google Cloud Storage, or Azure Blob Storage."
Snowflake gets faster speed than raw cloud object storage can deliver by using a scale-out design in which each scale-out compute node executes parts of queries in parallel. Also: "Each compute instance, known as a virtual warehouse (VW), caches data upon reading from object storage in its flash storage and memory. This ensures fast local data access with high IOPS for query execution."
The on-demand Actifio Sky appliances can be scaled out too, meaning Actifio cloud DR essentially uses the same cloud model as Snowflake.
Read more from the original source:
Actifio pushes cloud DR that accelerates slow object storage to near SSD speed - Blocks and Files
Qovery lets you deploy your application without managing your cloud infrastructure – TechCrunch
Remember how Heroku was a big breakthrough when it was first released? Qovery wants to do it again by building an abstraction layer between your code and your cloud infrastructure. You push code in a git repository and Qovery manages your services for you.
It's a container-as-a-service platform for developers. "Like on Heroku, you just have to put a .qovery.yml file to describe the dependencies you need," co-founder and CEO Romaric Philogène told me.
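As a rough illustration of what "describe the dependencies you need" looks like, such a file might declare the application and its managed services. The key names below are hypothetical, sketched for illustration only, not Qovery's documented schema:

```yaml
# Hypothetical .qovery.yml sketch -- key names are illustrative,
# not the documented Qovery configuration schema.
application:
  name: my-api
  project: demo
databases:
  - type: postgresql
    version: "12"
brokers:
  - type: rabbitmq
```

Qovery reads a file like this on each git push and provisions the declared services on the connected cloud account.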
Essentially, Qovery sits between your git repository and your cloud infrastructure account; the company doesn't take care of cloud hosting itself. You can connect your Qovery account with your GitHub, GitLab or Bitbucket account so that it automatically gets triggered when you push new code.
After that, Qovery automatically spins up new servers, managed databases and brokers (Kafka, RabbitMQ) for you. There are some ways to automate your deployment already with Terraform and continuous integration/continuous delivery software. But Qovery makes it easy to get started.
More importantly, Qovery is building integrations with multiple cloud providers. It already works with Amazon Web Services and the team is currently working on DigitalOcean and Scaleway support. Next up, Google Cloud and Microsoft Azure are on the road map.
Interestingly, you can design your own infrastructure for each branch. For instance, if you have a development branch to try out new features or a staging branch, you can spin up new servers for this branch without having to recreate your production environment from the start.
And that's arguably Qovery's most important feature. According to the startup, cloud hosting will become commoditized: every provider will offer managed databases, message brokers, etc., and the choice will come down to reliability, pricing and support level. You can imagine having a production application on AWS and a development branch running on another cloud provider.
Behind the scenes, Qovery relies heavily on Terraform and Kubernetes, with an additional layer on top of them. Compared with Heroku's monolithic philosophy, it scales more efficiently, as it has been designed around microservices from the ground up.
Qovery costs $15 per application per month. Many companies have dozens of applications running at the same time to handle different parts of a service, so if you switch everything over to Qovery, you'll pay $15 for each application.
If you already have a CI tool that works with your development team, you can use it instead of Qovery's built-in CI service. And there's no lock-in effect: you can stop using it once you have your own DevOps team.
The company has raised $1 million from Techstars and a long list of business angels.
Image Credits: Qovery
The rest is here:
Qovery lets you deploy your application without managing your cloud infrastructure - TechCrunch
Not Just in the Cloud: Serverless in Your Own Data Center – Data Center Knowledge
If you follow conversations about trendy DevOps technologies, you have probably heard of serverless functions. But you may not realize that serverless functions aren't just something available from public cloud providers. They can also run in on-premises or colocation data centers, using hybrid or private cloud architectures.
If you've wanted to explore serverless functions without having to depend on a public cloud provider, keep reading for an overview of how and why to deploy serverless functions in your own data center or colocation facility.
Related: Explaining Knative, the Project to Liberate Serverless from Cloud Giants
A serverless function is an application or part of an application that runs as part of serverless architecture. Developers can simply load serverless functions into a serverless hosting environment, then configure the conditions that should trigger the functions to execute.
There is no need to configure entire operating system environments or install software in the traditional sense -- hence the "serverless" label, which is somewhat of a misnomer, because the functions are still hosted on servers, even though the server environment is abstracted from end users.
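In practice, a serverless function is little more than a handler that the platform invokes whenever a trigger fires. The sketch below follows the event/context handler convention popularized by AWS Lambda's Python runtime; the event fields here are illustrative assumptions:

```python
# A serverless function reduced to its essence: a handler the platform
# calls per event. The "object uploaded" event shape is an illustrative
# assumption, not any specific platform's schema.

def handler(event, context):
    """Invoked once per triggering event; no OS or server to manage."""
    bucket = event.get("bucket", "unknown")
    key = event.get("key", "")
    # Real functions would process the object here (resize, index, etc.).
    return {"status": "processed", "object": f"{bucket}/{key}"}

result = handler({"bucket": "uploads", "key": "report.csv"}, None)
# result == {"status": "processed", "object": "uploads/report.csv"}
```

Everything outside the handler, including provisioning, scaling, and the runtime environment, is the platform's job, which is exactly what the "serverless" label refers to.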
Related: Cloudflare Wants to Eat AWSs Serverless Lunch
The serverless platforms that get the most attention, like Azure Functions and AWS Lambda, are public cloud services. These solutions are sometimes referred to as Functions-as-a-Service, or FaaS, because they enable users to deploy and execute serverless code using a cloud-based architecture that is similar to SaaS.
Although public cloud vendors have dominated the serverless market, there is nothing inherent in the serverless model that requires functions to be hosted in a public cloud. You can just as easily set up an environment within your own data center that allows your developers to deploy functions in a pain-free serverless way and execute them using an event-driven framework.
There are a number of reasons you may want to run serverless functions in your own data center. One is cost. Public cloud vendors charge you each time a serverless function executes, so you have a continuous ongoing expense when you use their services. If you run functions on your own hardware, most of your investment occurs upfront, when you set up the serverless environment. There is no direct cost for each function execution. Your total cost of ownership over the long term may end up being lower than it would be for an equivalent service in a public cloud.
Security is another consideration. By keeping serverless functions in your data center, you can keep all of your data and application code out of the cloud, which could help avoid certain security and compliance challenges.
Performance, too, may be better in certain situations for serverless functions that run in your own data center. For example, if the functions need to access data that is stored in your data center, running the functions in the same data center would eliminate the network bottlenecks you may face if your functions ran in the cloud but had to send or receive data from a private facility.
A final key reason to consider serverless solutions other than those available in the public cloud is that the latter services offer native support only for functions written in certain languages. Functions developed with other languages can typically be executed, but only by using wrappers, which create a performance hit. When you deploy your own serverless solution, you have a greater ability to configure how it operates and which languages it will support.
That said, the various serverless frameworks that are available for data centers have their own limitations in this respect, so you should evaluate which languages and packaging formats they support before selecting an option.
Deploying serverless functions in your own data center (or a colocation data center) is not much more complicated than running them in the public cloud. There are two main approaches to setting up a serverless architecture outside the public cloud.
The first is to run a private cloud within the data center, then deploy a serverless framework on top of it. In an OpenStack cloud, you can do this using Qinling. Kubernetes (which is not exactly a private cloud framework but is similar in that it lets you consolidate a pool of servers into a single software environment) supports Knative, Kubeless, and OpenWhisk, among other serverless frameworks.
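With Knative on Kubernetes, for example, deploying a scale-to-zero service is a single manifest; this uses the standard serving.knative.dev/v1 schema, with Knative's public hello-world sample image:

```yaml
# Knative Service: Knative builds, routes, and autoscales this workload,
# scaling it to zero when idle -- serverless behavior on your own cluster.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-go
          env:
            - name: TARGET
              value: "on-prem serverless"
```

Applying this with kubectl gives you an HTTP-triggered function-like workload without any public cloud dependency.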
The second approach is to use a hybrid cloud framework that allows you to run a public cloud vendor's serverless framework in your own data center. Azure Stack, Microsoft's hybrid cloud solution, supports the Azure serverless platform, and Google Anthos has a serverless integration via Cloud Run. (As for Amazon's cloud, AWS Outposts, its hybrid cloud framework, does not currently offer a serverless option.)
The first approach will require more effort to set up, but it yields greater control over which serverless framework you use and how it's configured. It may also better position you to achieve lower costs, because many of the serverless solutions for private clouds are open source and free to use.
On the other hand, the second approach, using a hybrid cloud solution from a public cloud vendor, will be simpler to deploy for most teams, because it does not require setting up a private cloud. It also offers the advantage of being able to deploy the same serverless functions in your data center or directly in the public cloud. A serverless function deployed via Azure Stack can be lifted and shifted with minimal effort to run on Azure Functions.
Serverless functions in the public cloud are very easy to deploy, but they do not offer the best cost, performance, or security for all types of workloads. For situations where the public cloud vendors' serverless solutions come up short, consider deploying serverless functions in your own data center or colocation facility.
Read more:
Not Just in the Cloud: Serverless in Your Own Data Center - Data Center Knowledge