
5 Ways That Restaurant Systems in the Cloud Unite Operations and Strategy – Hospitality Net

Recently, the capacity of restaurants and food services organizations to adapt quickly to drastic change was sorely tested. Moving as a whole organization, in one accord, from one paradigm to another in order to keep serving guests and thriving was key to success, and it still is. Achieving that mission centers on managing restaurant systems efficiently and in a more centralized manner.

What are some essential areas where restaurant systems in the cloud help unite your locations and brands? In what ways does cloud-based restaurant software increase your capacity to adjust and adapt as the industry landscape shifts? How does it affect the way your organization creates strategy and competitive momentum? Here's a selected list of five to consider.

Cloud technology enables restaurants and food services organizations to get a better sense of which menu items are the winners. Being able to roll out menus strategically, based on the popularity and profitability revealed by organization-wide data, is a decided advantage over on-prem solutions, which tend to silo locations and make comparative reporting more involved and less agile.

What's needed today is a better way for restaurant and food services location management, franchisees, and head offices to connect the dots and make changes as quickly and accurately as possible according to above-store analysis. Doing so promotes greater profitability across the spectrum, as quickly as necessary, in the areas of pricing, combos, specials, new items, and other essentials. The cloud enables that with timely, accurate, and simultaneous updates that help you optimize every factor affecting profitability.

Mobile-based ordering was a vital source of revenue for a spectrum of restaurant organizations over the past year, arguably more so than ever before. Even as we come out of a drastic disruption to how restaurants and food services engage with guests, the central concept of mobility being a primary revenue driver remains, with the market expected to reach $192 billion in 2025 compared to $126.91 billion today.

This is part of a movement toward menu access and ordering that gives guests more control over the process, so they can get what they want more easily from wherever they happen to be. All locations are brought into this equation by way of a central platform in the cloud. This empowers them, and strengthens the whole organization, too.

In the world of food services, a single item may appear under several menu names depending on the brand or concept. Tracking those items is far more efficient in a common, centralized environment that assigns a single SKU to multiple public-facing item names. With that in place, tracking items, related ingredients, and other vital factors becomes far easier to manage.
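To make the idea concrete, here is a minimal sketch in Python of how a centralized catalog might resolve several public-facing menu names to one internal SKU and roll sales up by SKU; the brands, item names, and figures are all hypothetical.

```python
# Minimal sketch of a centralized item catalog: one internal SKU maps to
# several public-facing menu names across brands. All names and figures
# are hypothetical.
from collections import defaultdict

# SKU -> list of (brand, public-facing menu name)
catalog = {
    "SKU-1042": [
        ("Burger Barn", "Classic Smash Burger"),
        ("Bistro 21", "Steakhouse Burger"),
    ],
}

# Sales arrive keyed by brand and menu name...
sales = [
    ("Burger Barn", "Classic Smash Burger", 120),
    ("Bistro 21", "Steakhouse Burger", 45),
]

# ...but roll up by SKU via an inverted index.
name_to_sku = {
    (brand, name): sku
    for sku, listings in catalog.items()
    for brand, name in listings
}

units_by_sku = defaultdict(int)
for brand, name, units in sales:
    units_by_sku[name_to_sku[(brand, name)]] += units

print(dict(units_by_sku))  # {'SKU-1042': 165}
```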

The cloud is designed to enable your teams to do that by its very nature. When you're able to see trends across a spectrum of brands and concepts in a single environment, item management is more efficient and more informative about cost-effectiveness for your whole organization.

By extension, analyzing business data of all kinds organization-wide is more straightforward in the cloud. A single technology platform that unifies solutions helps teams access the information they need in real time wherever they are to best determine performance and related strategy from a higher vantage point.

From there, your organization can better establish the most realistic benchmarks and standards by which to judge growth and performance. Above-store reporting on a single platform makes your organization more adaptive overall, and crucially so when it's time to switch gears.

Managing a siloed technology structure based on on-prem servers can be a distraction to delivering a great guest experience. As such, many leading restaurant organizations are continuing digital transformation processes by partnering with SaaS providers to ensure the stability and effectiveness of technology solutions on their behalf.

The immediate advantage is a reduction in overhead and maintenance. Cloud and SaaS support has other implications too, such as deployment of new integrations that in-house IT staff no longer have to worry about. That gives your in-house technology teams more time for innovation, collaborating with technology partners to envision proactively what the future holds for the industry and for your business.

Connecting with the right technology partners, especially in this era of transition, is top of mind for many organizations. The next step is making sure that your business finds a partner who will help you transition into that new era and be supportive for the long term. What's the best way to approach that?

We've authored a resource on that subject, helping you outline a framework by which to judge current partners as well as engage with new ones.

You can easily get your copy of that resource right here.

Infor is a global leader in business cloud software specialized by industry. Providing mission-critical enterprise applications to 65,000 customers in more than 175 countries, Infor software is designed to deliver more value and less risk, with more sustainable operational advantages. We empower our 17,000 employees to leverage their deep industry expertise and use data-driven insights to create, learn and adapt quickly to solve emerging business and industry challenges. Infor is committed to providing our customers with modern tools to transform their business and accelerate their own path to innovation. To learn more, please visit http://www.infor.com.

Read more from the original source:
5 Ways That Restaurant Systems in the Cloud Unite Operations and Strategy - Hospitality Net


The rise of the cloud data warehouse – The Register

Paid Feature: The cloud has a habit of transforming on-premises technologies that have existed for decades.

It absorbed business applications that used to run exclusively on local servers. It embraced the databases they relied on, presenting an alternative to costly proprietary implementations. And it has also driven new efficiencies into one of the most venerable on-premises data analytics technologies of all: the data warehouse.

Data warehousing is a huge market. Allied Market Research put it at $21.18bn in 2019 and estimates that it will more than double to $51.18bn by 2028. The projected 10.7 percent CAGR between 2020 and 2028 comes from a raw hunger for data-driven insights that we've never seen before.

It isn't as though data warehousing is a new concept. It has been around since the late eighties, when researchers began building systems that funneled operational data through to decision-making systems. They wanted that data to help strategists understand the subtle currents that made a business tick.

This product category initially targeted on-premises installations, with big iron servers capable of handling large computing workloads. Many of these systems were designed to scale up, adding more processors connected by proprietary backplanes. They were expensive to buy, complex to operate, and difficult to maintain. The upshot, AWS claims, was that companies found themselves spending a lot on these implementations and not getting enough value in return.

As companies produced more data, it became harder for these implementations to keep up. Data volumes exploded, driven not just by the increase in structured records but also by an expansion in data types. Unstructured data, ranging from social media posts to streaming IoT data, has sent storage and processing requirements soaring.

Cloud computing evolved around the same time, and AWS argues that it changed data warehousing for the better. Data warehousing has been popular with customers in sectors like financial services and healthcare, which have been heavy analytics users.

Manage data at any scale and velocity while remaining cost effective

But the cloud has opened up the concept to far more companies thanks to lower prices and better performance, according to AWS. Applications previously restricted to multinational banks and academic labs are now open to smaller businesses. For example, you're able to perform data analytics in the cloud with benefits like scale, elasticity, time to value, cost efficiency, and readily available applications.

The numbers bear this out. According to Research and Markets, the global market for data warehouse as a service (DWaaS) products will enjoy a 21.7 percent CAGR between 2021 and 2026, growing from $1.7bn to $4.5bn.

The largest cloud players have leaped on this trend, with Microsoft offering its Synapse service and Google running BigQuery. AWS addressed the market first, announcing Redshift as the first cloud data warehouse in 2012. "The idea was pretty simple," AWS told us. The company wanted to give customers a scalable solution, one where they could use the flexibility of the cloud to manage data at any scale and velocity while remaining cost effective.

Unlike online transaction processing databases such as Amazon Aurora, Redshift targets online analytics processing (OLAP), supporting fast queries with scalable nodes and massively parallel processing (MPP) in a cluster. The cloud-based data warehouse follows the AWS managed-database ethos: rather than relying on a customer's administrators to take care of maintenance tasks, the company handles them behind the scenes in the cloud.

Aside from standing up hardware, this includes patching the software and handling backups and recovery. That means developers can focus on building applications, from modernizing existing data warehouse strategies through to accelerating analytics workloads, which Redshift handles by using back-end parallel processing to spread queries over up to 128 nodes. Companies can use it for everything from analyzing global sales data to crunching through advertising impression metrics.

AWS also highlights other applications that can draw on cloud-based data warehouse technology, including predictive analytics, which enable companies to mine historical data for insights that could help to chart future events. Redshift also helps customers with applications that are often time critical, AWS says. These include recommendation and personalization, and fraud detection.

Performance at the right price is key, asserts AWS, which reports that customers' latency requirements for processing and analyzing their data are shortening, with many wanting to make things almost real time.

AWS benchmarked Redshift against other market players and found price performance up to three times better than the alternatives. The system's ability to dynamically scale the number of nodes in a cluster helps here, as does its ability to access data in place from various sources across a data lake.

Traditionally, data sharing is a cumbersome process in which files are uploaded manually from one system and copied to another. This approach, AWS says, does not provide complete and up-to-date views of the data: the manual processes introduce delays, human error, and data inconsistencies, resulting in stale data and poor decisions.

In response to feedback from customers who wanted to share data at many levels to enable broad and deep insights but also minimize complexity and cost, AWS has introduced a capability that overcomes this issue.

Announced late last year, Amazon Redshift data sharing enables you to avoid copies. The new capability lets customers query live data at their convenience and get up-to-date views across organizations, customers, and partners as the data is updated. In addition, Redshift integrates with AWS Data Exchange, enabling customers to easily find and subscribe to third-party data without extracting, transforming, and loading it.
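As an illustration of the pattern AWS describes, here is a hedged sketch of how a producer cluster might expose live tables and a consumer cluster might query them. The endpoints, credentials, schema, and namespace GUIDs are placeholders, and the SQL follows Redshift's documented CREATE DATASHARE / GRANT USAGE / CREATE DATABASE FROM DATASHARE pattern; consult the current documentation before relying on exact syntax.

```python
# Hedged sketch of Redshift data sharing: the producer exposes live tables,
# the consumer queries them without copies or ETL. Hosts, credentials, and
# namespace GUIDs are placeholders.
import psycopg2

producer = psycopg2.connect(
    host="producer.example.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="sales", user="admin", password="...")
producer.autocommit = True  # the DDL below should not run inside a transaction
cur = producer.cursor()
cur.execute("CREATE DATASHARE sales_share;")
cur.execute("ALTER DATASHARE sales_share ADD SCHEMA public;")
cur.execute("ALTER DATASHARE sales_share ADD ALL TABLES IN SCHEMA public;")
# Grant the share to the consumer cluster's namespace (placeholder GUID).
cur.execute("GRANT USAGE ON DATASHARE sales_share "
            "TO NAMESPACE 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee';")

consumer = psycopg2.connect(
    host="consumer.example.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="analyst", password="...")
consumer.autocommit = True
cur = consumer.cursor()
# Mount the share as a local database, then query the producer's live data.
cur.execute("CREATE DATABASE sales_live FROM DATASHARE sales_share "
            "OF NAMESPACE 'ffffffff-0000-1111-2222-333333333333';")
cur.execute("SELECT COUNT(*) FROM sales_live.public.orders;")
print(cur.fetchone())
```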

Amazon Redshift data sharing is already proving a hit with AWS customers, who are finding new use cases such as data marketplaces and workload isolation.

Data lakes have evolved as companies draw in data of different types from multiple sources. When unstructured data comes in, such as machine logs, sensor data, or clickstream data from websites, you don't know about its quality or what insights you're going to find in it.

AWS told us many customers have asked for data stores where they can break free of data silos and land all of this data quickly, process it, and move it to more SLA-intensive systems for query and reporting like data warehouses and databases.

The cloud is the perfect place to put this data thanks to commodity storage. Storing data in the cloud is cheap thanks to a mixture of economies of scale on the cloud service provider side, and tiered storage that lets you put data in lower-cost tiers such as S3.

Data gravity is the other driver. A lot of data today begins in the cloud whether it comes from social media, machine logs, or cloud-based business software. It makes little sense to move that data from the cloud to on-premises applications for processing. Instead, why not just shorten the time it takes to get insights from it, AWS says.

The company designed the data warehouse to share information in the cloud, folding in API support for direct access. Redshift can pull in data from S3's cheap storage layer if necessary for fast, repeated processing, or it can access it in place. It also features different types of nodes optimized for storage or compute. It can interact with data in Amazon's Aurora cloud-native relational database, and other relational databases via Amazon Relational Database Services (RDS).

It also includes support for other interface types. Developers can import and export data to and from other data warehousing systems using open formats like Parquet and Optimized Row Columnar (ORC). Client applications also access the system via standard SQL, ODBC, or JDBC interfaces, making it easy to connect with business intelligence and analytics tools.
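Because Redshift speaks standard SQL over the PostgreSQL wire protocol, a stock driver is enough to query it, and the same connection can trigger a Parquet export with UNLOAD. The sketch below assumes the psycopg2 package; the cluster endpoint, credentials, table, bucket, and IAM role are placeholders.

```python
# Hedged sketch: querying Redshift through a standard PostgreSQL driver and
# exporting a result set to S3 as Parquet with UNLOAD. Endpoint, credentials,
# table, bucket, and IAM role ARN are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="analyst", password="...")
with conn.cursor() as cur:
    # Ordinary SQL works as it would against any Postgres-compatible system.
    cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region;")
    for region, total in cur.fetchall():
        print(region, total)

    # Export to an open columnar format; note the doubled quotes inside UNLOAD.
    cur.execute("""
        UNLOAD ('SELECT * FROM sales WHERE sale_date >= ''2021-01-01''')
        TO 's3://my-bucket/exports/sales_'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-unload'
        FORMAT AS PARQUET;
    """)
conn.commit()
```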

The ability to scale the storage layer separately to the compute nodes makes the system more flexible and eliminates network bottlenecks, the cloud service provider says.

Cloud databases also provide application developers with other services that they can use to enhance those insights. One of the most notable for AWS is its machine learning capability. ML algorithms are good at spotting probabilistic patterns in data, making them useful for analytics applications, but inference - the application of statistical models when processing new data - takes a lot of computing power. Scalable cloud computing power makes that easier, AWS says.

Cloud-based machine learning services are also easy for companies to consume because they are pluggable with data warehouses via application programming interfaces (APIs). AWS makes these available to anyone who knows SQL. Customers can use SQL statements to create and use machine learning models from data warehouse data using Redshift ML, a capability of Redshift that provides integration with Amazon SageMaker, a fully managed machine learning service.
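A hedged sketch of what that looks like: the CREATE MODEL statement hands training off to SageMaker, after which the named function becomes callable from ordinary queries. The table, columns, role ARN, and bucket below are hypothetical.

```python
# Hedged sketch of Redshift ML: a model is created and invoked entirely in
# SQL. Table, columns, IAM role, and S3 bucket are hypothetical; training is
# delegated to SageMaker behind the scenes.
TRAIN = """
CREATE MODEL customer_churn
FROM (SELECT age, tenure_months, monthly_spend, churned
      FROM customer_activity)
TARGET churned
FUNCTION predict_churn
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-ml'
SETTINGS (S3_BUCKET 'my-ml-artifacts');
"""

# Once training completes, the model behaves like any other SQL function.
SCORE = """
SELECT customer_id,
       predict_churn(age, tenure_months, monthly_spend) AS churn_prediction
FROM customer_activity;
"""

# Both statements run over any standard connection, e.g. the psycopg2
# pattern shown earlier.
print(TRAIN, SCORE)
```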

In 2019, Amazon Redshift also introduced support for geospatial data by adding a new data type: GEOMETRY. It stores coordinate data in table columns, making it possible to handle geospatial polygons for mapping purposes. This makes it possible to combine location information with other data types in conventional data warehousing queries and when building machine learning models for Redshift.
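A brief hedged sketch of the idea: a GEOMETRY column sits alongside ordinary columns, and ST_* functions filter on it. The table and polygon here are hypothetical.

```python
# Hedged sketch of the GEOMETRY type: location data lives in an ordinary
# table column and is filtered with ST_* spatial functions. Table, column,
# and polygon values are hypothetical.
DDL = "CREATE TABLE stores (id INT, name VARCHAR(64), location GEOMETRY);"

# Which stores fall inside a delivery zone expressed as a WKT polygon?
ZONE_QUERY = """
SELECT id, name
FROM stores
WHERE ST_Contains(
        ST_GeomFromText('POLYGON((0 0, 0 10, 10 10, 10 0, 0 0))'),
        location);
"""
print(DDL, ZONE_QUERY)
```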

As data warehousing continues its move to the cloud, it shows no sign of slowing down. Customers can choose offerings from the largest cloud service providers or from third-party software vendors alike. Evaluation criteria will depend on each customer's individual strategy, but the need to scale compute and storage capabilities is sure to factor highly in any decision. One thing's for sure: the cloud will help customers as their big data gets bigger still.

This article is sponsored by AWS.

Continue reading here:
The rise of the cloud data warehouse - The Register


How is cloud computing transforming the accounting industry? – HostReview.com

Nowadays, the impact of globalization, rapid advances in science and technology, the rise of big data, the widespread adoption of internet-based applications, and even standardization have created the right context for the emergence of cloud technology. Cloud computing affects many industries and businesses, and the accounting sector is among them. The focus of this article is to highlight the effect of the cloud on the accounting industry.

Our approach focuses specifically on the monetary benefits, as well as the other benefits the cloud has to offer. Accounting is one of the traditional sectors next in line for an automation upgrade. This means practice owners will have more options to automate basic accounting tasks, freeing them up to focus on their customers. They will be more than happy to eliminate administrative and manual data-entry tasks so they can concentrate on their clients and on building their business.

At the beginning of the cloud era, accountants were hesitant about it due to a lack of knowledge and expertise; it was unexplored territory for them.

As the decade progresses, the paradigm is shifting: more and more accounting firms and small businesses are moving to the cloud for greater productivity and efficiency. Accountants now see a great opportunity to create new roles, and the change is happening fast. They are also not afraid of using automation techniques to complete their tasks.

With this advancement in technology, accountants should embrace the cloud and its associated benefits; otherwise, they run a real risk of being left behind as the cloud era progresses. The impact of cloud computing is now undisputed, and it will keep shaping ideas for the future.

Once accountants assess the results of moving the whole accounting system to the cloud, the multiple benefits for the financial and accounting sector become clear.

For a small business, traditional accounting applications are expensive and complicated: they need a local IT setup and storage capacity to function, along with software that must be installed, configured, and updated.

In the cloud, by contrast, accounting operations become much easier, with increased collaboration, anywhere access, and scalable systems. The cloud ensures that all users have access to the same information from any location; the only prerequisite is a stable internet connection.

As everything happens in the cloud, there is no need to set up local servers or hardware. Also, most cloud companies follow a pay-as-you-go model: instead of paying a large sum upfront, you simply pay for your usage on a monthly basis.

You can access your information anytime, from any location without geographical constraints, and from multiple devices. This is crucial for decision-making, as you get access to your financial information in real time.

With fast data transfer and real-time data access, users are more in sync, which results in higher productivity and better team collaboration. Resources can also be scaled at any time to match your business requirements.

In the cloud, your data is backed up daily at multiple levels; in case of accidental deletion or loss, it can be restored at any time from backup copies. Data is stored in both onsite and offsite locations so that, whatever the hazard, it can be recovered to ensure business continuity.

Securing your data is the top priority for cloud hosting providers, as they understand its value. Enterprise-grade security tools protect the cloud network, and many providers use 256-bit SSL encryption to add a further layer of security to data.

The cloud comes in different variants and offers different kinds of solutions, so organizations should consider all the characteristics of their business before choosing a suitable cloud hosting service. If the right solution is chosen, it will certainly enhance their capabilities.

Data migration is another very important aspect to consider before moving to the cloud; the amount of data to be transferred and the best time to perform the migration are key to a successful, smooth transition.

There are financial and pricing aspects involved too, so pay attention to the costs that come with a cloud-based system. The benefits the cloud offers have a price, and a careful evaluation is required to ensure that pricing does not exceed the cost of the current accounting model.

Being an accountant and financial advisor requires the real-time data access that remote technology provides. Not only can it assist in creating and tracking progress with prospective clients, it also gives 24/7 access to the information your clients need to make immediate decisions. The more business-friendly touchpoints you have with clients to monitor their business's success, the more value you create in that relationship.

View original post here:
How is cloud computing transforming the accounting industry? - HostReview.com


Most HPC data centers will deploy quantum computing within the next few years – TechRadar

Most high-performance computing (HPC) data center operators expect to deploy quantum computing solutions within the next couple of years, a new report from Atos and IQM has found.

The firms polled 110 key decision-makers from HPC centers worldwide and found that getting optimal performance out of HPC, while ensuring security and resilience, is getting more challenging for users.

To tackle the problem, 76% plan on using quantum computing by 2023. Furthermore, 71% plan to move to on-premises quantum computing by 2026.

In fact, quantum computing is the number one technology priority in Europe and among the top three worldwide. Three-quarters (76%) of HPC centers are already using quantum computing, or have plans to do so, within the next two years. They expect quantum computers to solve supply chain logistics challenges, as well as those related to climate change. They also expect the technology to solve existing problems faster and to reduce overall computing costs.

Furthermore, the top use cases for HPC centers are database searching, investment risk analysis, molecular modeling, and asset management.

Being able to mix standard elements with custom-developed infrastructure components is what makes cloud an essential part of the HPC architecture, the report further found. Hybrid and cloud deployments are of high priority all over the world, but very little is known about how quantum will work side-by-side with classical HPC infrastructure.

This will result, the report concludes, in the growth of outsourcing operations and maintenance in quantum computing.

Unlike classic computers, whose bits (the basic unit of information) can only take one of two states (either 0 or 1), quantum computers' qubits can exploit the collective properties of quantum states (superposition, interference, entanglement) when performing calculations. That can make them vastly faster than traditional computers for certain workloads, but at the moment they are only capable of solving specific computational problems.
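The superposition idea can be illustrated without any quantum hardware. The toy Python sketch below represents a qubit as a two-component complex vector and applies a Hadamard gate to |0⟩, producing a state that measures as 0 or 1 with equal probability; this is an illustration of the math, not of how a real quantum computer is programmed.

```python
# Toy illustration of the bit/qubit distinction: a qubit's state is a complex
# vector, and a Hadamard gate puts |0> into an equal superposition, so a
# measurement yields 0 or 1 with probability 0.5 each (Born rule).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # the state |0>
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = hadamard @ ket0          # (|0> + |1>) / sqrt(2): a superposition
probabilities = np.abs(state) ** 2
print(probabilities)             # [0.5 0.5]
```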


Read the rest here:
Most HPC data centers will deploy quantum computing within the next few years - TechRadar


FBI spams thousands with fake infosec advice after ‘software misconfiguration’ – The Register

The FBI has admitted that a software misconfiguration let parties unknown send legit-looking email from its servers.

A statement from the bureau, dated November 14, states the agency "is aware of a software misconfiguration that temporarily allowed an actor to leverage the Law Enforcement Enterprise Portal (LEEP) to send fake emails."

Spam-tracking service Spamhaus tweeted about the incident on November 13.

The mails contained a warning that FBI monitoring had detected "exfiltration of several of your virtualized clusters in a sophisticated chain attack" perpetrated by one Vinny Troia, founder of the infosec firms Shadow Byte Cyber and Night Lion Security.

There is no indication Troia had anything to do with the incident and The Register makes no suggestion he was in any way involved. However, an entity using the name and Twitter handle "@pompompur_in" appears to have told Krebs on Security they were behind the incident.

"I could've 1000% used this to send more legit looking emails, trick companies into handing over data etc.," Pompompurin told Krebs. "And this would've never been found by anyone who would responsibly disclose, due to the notice the feds have on their website."

Troia also appears to have attributed the incident to @pompompur_in.

For what it's worth, @pompompur_in's Twitter profile states it also operates a private account on the service with the handle @seds. The profile for that account reads: "Call me vinny troia the way I be selling DBs." Other @pompompur_in posts suggest bad blood between whoever operates the account and Troia.

Whoever was behind the attack, the FBI has admitted it was real and that a server it operates was used to send the mails. Another Spamhaus Tweet suggests that whoever got in was able to use the FBI server to send two spurts of mail, with around 100,000 messages making it out.

The server in question was part of LEEP, which the FBI describes as "a secure platform for law enforcement agencies, intelligence groups, and criminal justice entities [that] provides web-based investigative tools and analytical resources" for other law enforcement agencies.

"Users collaborate in a secure environment, use tools to strengthen their cases, and share departmental documents." Or at least that's what they do when they're not trying to figure out what "exfiltration of several of your virtualized clusters in a sophisticated chain attack" means.

But we digress.

The FBI explains that the server was "dedicated to pushing notifications for LEEP and was not part of the FBI's corporate email service", and that no data or personally identifiable information was accessed.

"Once we learned of the incident, we quickly remediated the software vulnerability, warned partners to disregard the fake emails, and confirmed the integrity of our networks."

Unusually, the FBI's posts don't mention an investigation into the incident. Perhaps the Bureau's waiting for the weekend to end before trying to track down @pompompur_in.

More here:
FBI spams thousands with fake infosec advice after 'software misconfiguration' - The Register


Friday FOSS fest: Franz, RamBox, Pidgin and more – The Register

Most modern chat systems are entirely proprietary: proprietary clients, talking proprietary protocols to proprietary servers. There's no need for this: there are free open standards for one-to-one and one-to-many comms for precisely this sort of system, and some venerable clients are still a lot more capable than you might remember.

But as it is today, if you need to be on more than one chat system at once, the official way is to install each service's client app, meaning multiple clients or, at best, multiple tabs open in your web browser. Most of these "clients" are JavaScript web apps anyway, running inside Electron, an embedded Chromium-based single-site browser. Which is fine, but Chrome is famously memory-hungry.

There is a brute-force way round this: have one app that embeds lots of separate Electron instances in tabs. There are a few of these around. First came RamBox, followed by Franz. Both use the "freemium" model: there's a completely functional free client, plus subscriptions for extra features. If you prefer to avoid such things, both services have no-cost forks: Ferdi from Franz and Hamsket from RamBox. A newer rival still is Station.

They're not perfect, but these messaging aggregators are very handy: you get all your messaging apps in a single client, with a single set of notifications, and they're separate from your browsers. You can configure multiple accounts on each service, which can be tricky in a browser if it stores your credentials. The snags are that the UI inside each tab is totally different, and they are very memory-hungry: each tab takes hundreds of megs of RAM, and with a lot of tabs the parent app can easily snarf a couple of gigs. That's the price of building apps in JavaScript.

But there is another way. If you were online in the 1990s, you may recall the early days of online chat, with multiple proprietary "instant messengers": AIM, Yahoo, MSN and so on. Most of them have been shut down now, although the oldest of all, ICQ, was spun off by AOL and is still around. Some of the clients could connect to rival services, leading to decidedly hairy hacks to validate that clients were genuine, such as AOL intentionally exploiting a buffer overflow in its own code. This didn't stop third parties creating their own clients, such as the Linux client GAIM.

Back in 1999, a group came together to create a free open standard for person-to-person messaging: Jabber, later renamed the Extensible Messaging and Presence Protocol or XMPP. For a while, it was widespread, including large corporations such as Facebook and Google, although most have removed support for it now.

The original purpose of GAIM went away, but the app did not. The team added support for other operating systems and protocols and renamed it Pidgin. It's still very much alive. It runs on Windows as well as Linux, and it works not only with any XMPP service, such as Cisco's, but also with Apple's Bonjour, Google Talk, Groupwise, ICQ, IRC, SIMPLE, and Zephyr.

There are also plugins available for dozens more Telegram, Facebook Messenger, both personal and business versions of Skype, Discord, Mattermost, QQ, Rocket.chat, Twitter, Slack, Steam, Threema, WeChat, and more. It may not talk to every chat service out there, but it supports most of them.

There is a drawback with multiprotocol clients like this, though, be they tabbed web apps or true native client-server setups: you need to configure all the protocols you will use in each client. A newer protocol hopes to tackle that problem: Matrix. Matrix can do point-to-point conversations, as XMPP does, but also channels and chatrooms, and, more importantly, it can link to other services via server-side bridges. A Matrix client (the reference one is Element) can bring multiple messaging services together into a single inbox in a single local app, connected via a single login.
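To give a feel for why a single login can cover many services, here is a minimal hedged sketch against Matrix's open client-server REST API: log in once, then post a message into a room, which could just as well be a bridged channel on another network. The homeserver URL, credentials, and room ID are placeholders.

```python
# Hedged sketch of Matrix's client-server REST API: one login, then messages
# into any room, bridged or native. Homeserver, credentials, and room ID are
# placeholders; the endpoints follow the published r0 client-server spec.
import requests

HOMESERVER = "https://matrix.example.org"

# Log in once; the returned access token covers every room on the account.
login = requests.post(
    f"{HOMESERVER}/_matrix/client/r0/login",
    json={
        "type": "m.login.password",
        "identifier": {"type": "m.id.user", "user": "alice"},
        "password": "...",
    },
).json()
token = login["access_token"]

# Send a plain-text message; the room could be a bridge into another network.
room_id = "!abc123:example.org"
requests.put(
    f"{HOMESERVER}/_matrix/client/r0/rooms/{room_id}/send/m.room.message/txn1",
    headers={"Authorization": f"Bearer {token}"},
    json={"msgtype": "m.text", "body": "Hello from one unified login"},
)
```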

Matrix can be tricky to configure, though, so some companies are offering paid-for messenger-unification services running on top of the Matrix protocol. Beeper is a commercial effort and includes Apple iMessage via a hilarious workaround, whereas the cheaper Element One is basically a hosted version of Matrix. A new mystery contender is Texts, which is closed-source for now although the company says it will open-source its SDK later.

There never was a golden age of any-to-any chat systems, but fifteen years ago things were a lot better than they are now. There is reason to hope, though: there are signs that things are getting better.

Read the original here:
Friday FOSS fest: Franz, RamBox, Pidgin and more - The Register


Dynatrace : Automatic connection of logs and traces accelerates AI-driven cloud analytics – marketscreener.com

As digital transformation continues to accelerate and enterprises modernize with the adoption of cloud-native architectures, the number of interconnected components and microservices is exploding. Logs are a critical ingredient in managing and optimizing these application environments. Dynatrace now unifies log monitoring with its patented PurePath technology for distributed tracing and code-level analysis. Logs are now automatically connected to distributed traces for faster analysis and optimization of cloud-native and hybrid applications.

Customers expect enterprises to deliver increasingly better, faster, and more reliable digital experiences. Cloud-native observability is a prerequisite for companies that need to meet these expectations. Observability enables a holistic approach to automation and BizDevOps collaboration for the optimization of applications and business outcomes.

Logs are a crucial component in the mix that help BizDevOps teams understand the full story of what's happening in a system. Logs include critical information that can't be found elsewhere, like details on transactions, processes, users, and environment changes.

A key element of effectively leveraging observability is analyzing telemetry data in context. Being able to cut through the noise, with all the relevant logs at hand, dramatically reduces the time it takes to get actionable insights into the optimization and troubleshooting of workloads.

Without automation, this contextualization is hardly feasible, especially in large and dynamic environments. Modern heterogeneous stacks consist of countless interconnected and ephemeral components and microservices. Log entries related to individual transactions can be spread across multiple microservices or serverless workloads. Manual and configuration-heavy approaches to putting telemetry data into context and connecting metrics, traces, and logs simply don't scale.

With PurePath distributed tracing and analysis technology at the code level, Dynatrace already provides the deepest possible insights into every transaction. Starting with user interactions, PurePath technology automatically collects all code execution details, executed database statements, critical transaction-based metrics, and topology information end-to-end.

By unifying log analytics with PurePath tracing, Dynatrace is now able to automatically connect monitored logs with PurePath distributed traces. This provides a holistic view, advanced analytics, and AI-powered answers for cloud optimization and troubleshooting.

Automatic contextualization of log data works out-of-the-box for popular languages like Java, .NET, Node.js, Go, and PHP, as well as for NGiNX and Apache Web servers. Unlike other approaches in the market, Dynatrace allows you to apply this new functionality broadly via central activation. This automated approach avoids any manual configuration of tracers or agents and the need to restart processes.

In addition, Dynatrace offers an open-source route to the same contextualization of log entries and distributed traces via OpenTelemetry.
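As a hedged illustration of that open-source route, the Python sketch below uses the OpenTelemetry SDK to stamp a log line with the active trace and span IDs, which is what lets a log backend join log entries to distributed traces. It assumes the opentelemetry-sdk package is installed; this is a generic OpenTelemetry pattern, not Dynatrace's own implementation.

```python
# Hedged sketch: enriching application logs with OpenTelemetry trace context
# so a backend can join logs to distributed traces. Assumes the
# opentelemetry-sdk package; the span name and log message are hypothetical.
import logging

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider

trace.set_tracer_provider(TracerProvider())
tracer = trace.get_tracer(__name__)

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("payments")

with tracer.start_as_current_span("charge-card") as span:
    ctx = span.get_span_context()
    # W3C-style hex encoding: 32 hex chars for the trace ID, 16 for the span ID.
    log.info(
        "payment declined: card verification failed trace_id=%s span_id=%s",
        format(ctx.trace_id, "032x"),
        format(ctx.span_id, "016x"),
    )
```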

You can instantly investigate logs related to individual transactions on the new Logs tab in the PurePath view. This immediately reveals additional context.

The example below includes analysis of a payment issue. It shows that a call to the payment provider was declined because the credit card verification failed. From the call perspective, you can easily see the related entry in the code-level information to understand where in your code this specific log entry was really created.

Dynatrace makes it easy to view the log lines related to individual spans or a broader view that covers transactions end-to-end or even entire workloads.

Uniquely, Dynatrace also provides connections to the processes that handle each call.

In the screenshot above, you can see that a single trace created 64 different log entries. The top entry, marked with status ERROR, is critical to the analysis of this issue.

This seamless user journey is also available from the log viewer side. You can easily get from individual log lines to a transaction-centric view for additional context and analysis.

Starting with OneAgent version 1.231, you can activate our OneAgent code modules for Java, .NET, Go, Node.js, PHP, NGiNX, or Apache Web server to automatically enrich logs with trace context, without any manual code or configuration change on your workload. Just go to Settings > Server-side service monitoring > Deep Monitoring > New OneAgent features. This ensures that trace IDs are automatically added to log lines for transaction-based analytics.

Structured log entries that are ingested via the Generic log ingestion of Log Monitoring V2 will show up in related PurePath traces starting with Dynatrace version 1.232.

Within the next 90 days, all transaction-related logs will show up in PurePath view after activation of this new functionality.

To find out more, see Dynatrace Log Monitoring documentation and PurePath distributed tracing documentation.

Ready to try it for yourself? If so, start your free trial today!

Disclaimer

Dynatrace Inc. published this content on 18 November 2021 and is solely responsible for the information contained therein. Distributed by Public, unedited and unaltered, on 18 November 2021 13:02:03 UTC.

Follow this link:
Dynatrace : Automatic connection of logs and traces accelerates AI-driven cloud analytics - marketscreener.com


EU digital sovereignty project Gaia-X opens its summit with the departure of Scaleway – The Register

French cloud hosting outfit Scaleway is to depart the EU's data sovereignty project, Gaia-X, with CEO Yann Lechelle worrying that what began with splendid ideals is getting increasingly mired in the status quo.

Scaleway's announcement came as the second Gaia-X summit got under way, titled "Here to deliver", and amid rumblings from members over the sponsorship of the gathering by the likes of Huawei and Alibaba as well as the involvement of US cloud giants such as Microsoft and AWS.

"Scaleway will not renew its GAIA-X membership," the company said. "The objectives of the Association, while initially laudable, are being sidetracked and slowed down by a polarization paradox which is reinforcing the status quo, that is an unbalanced playing field."


Scaleway was one of the founders of Gaia-X, and its departure will doubtless cause more than a little discussion behind the scenes during this week's event.

"It's been brewing for many months," Scaleway CEO Yann Lechelle told The Register. Of Gaia-X, he said: "The idea was to create a sort of improvement in terms of sovereignty, right. And sovereignty is very loaded as a term.

"Does anyone question the sovereignty of Microsoft in the US?" he asked, rhetorically. "Nobody does what we do know is that we do not have sovereignty when it comes to tech, and that is the main issue."

Lechelle went on to highlight differences between the German and French approaches, noting that a preference by some countries to work with the tech giants ("itself, not a problem," he added) "reinforces the dependence and the dependencies on extraterritorial tech monopolies."

"Gaia-X," he said, "evolved in terms of governance. I fought very hard that the governance remained strictly European. But then, the German counterparts and even the French big players not cloud providers, but big consumers of cloud wanted to keep working with the big tech players.

"And so at the end of the day, the reason why we're leaving is that Gaia-X as a construct is only reinforcing the status quo, which is that dominating players will keep dominating."

As the news broke, Sid Nag, a VP analyst at Gartner, told The Register: "My feeling is that players like Scaleway might be saying, 'You know what, is it worthwhile for us to continue to invest in this?' For a second or third-tier cloud provider it involves a significant investment of time and money and energy. They're probably saying, 'I'm not seeing the benefit of doing this any more.'

"The Gaia-X initiative was about creating a sovereign cloud of sorts for the Eurozone. But there's this secondary motivation of competing with the hyperscalers based in North America.

"If there was a way for North American providers to operate on Europe-based data for their European clients in a sovereign manner then I don't think this would even be a conversation today."

On the latter point, Lechelle noted efforts by the French government to, as he put it, "move away from the dusty mainframes and move to the public cloud."

"Except, at the same time, there's the CLOUD Act. And the CLOUD Act gives the US way too much access worldwide. So we talk about extraterritorial laws. And the EU is weak in that sense, because we do not have reciprocity."

As Microsoft continues to push Office customers to the cloud, European governments have found themselves in somewhat of a quandary when it comes to data. "So what they're saying in a way is like, OK, here is the deal: Microsoft, you need to do a joint venture with French players, create a packaged version of Azure, and Office 365. It will be managed by French operators, and therefore it will be disjointed from the US cloud and therefore not subject to the CLOUD Act."

As for Gaia-X, Lechelle said Scaleway would "be looking in from the outside." After all, Gaia-X is an open project. "But," he added, "we don't have time for this now; we need to spend our energy becoming a relevant player at scale."

Reactions to Scaleway's departure have so far been muted. Lechelle put it diplomatically, saying: "Some of the members might want to reach out to us (and they have); maybe they will disengage, maybe they will not..." or, as he suggested, some might opt to leave things until after this week's summit.

"We did not intend to create a mass movement away from Gaia-X," he told us. "I actually believed in it, even though the foundation was asymmetrical because of German interests, working with US players, mostly, and French players, who were also cloud providers."

Other members reiterated their support. An HPE spokesperson told The Register the company was "committed to the framework. We are contributing to the Gaia-X foundations and driving a number of projects with customers and partners."

Amanda Brock, CEO of Gaia-X member OpenUK, also confirmed her organisation's support and commitment. "The Gaia-X members represent the state of the art for Europe in terms of digital sovereignty and they act as the backbone for Europe's federated data model," she said.

"The UK will engage with this more fully over time, and with that in mind, we are working with a group across the UK to shape a potential Gaia-X Hub for the UK to launch in 2022."

Of Scaleway's departure, Brock said: "With 300-plus members, if we are realistic, there will inevitably be a level of dropout. I don't see anything to be surprised about in the announcements this week. At the same time as we see this natural evolution, we see a true doubling down on open from Europe."

Simon Hansford, CEO of UKCloud, took a slightly more cautious tone: "It has been puzzling to see the increasing presence of global giants such as Huawei and Alibaba on Gaia-X which we wholeheartedly support in principle. Nonetheless, this does raise questions as to whether the current setup of Gaia-X is capable of fulfilling its objective as a genuinely sovereign cloud for Europe."

The Register put Scaleway's points to Gaia-X and will update should we receive a response.

See the article here:
EU digital sovereignty project Gaia-X opens its summit with the departure of Scaleway - The Register


Why is AI an obsession for business insiders? – just-auto.com


Business leaders are still worrying about artificial intelligence (AI), but with Facebook pushing hard into the metaverse, augmented reality (AR) has also proven a massive concern for corporate chieftains.

That's according to analytics firm GlobalData. The company defines a theme as any issue that keeps chief executives awake at night. In a thematic survey published in October 2021, GlobalData gauged the business community's current sentiment towards emerging technologies that kept executives stirring into the early hours.

The research found that AI was the technology perceived as most disruptive in Q3 2021, regaining its position from AR, which, as Verdict previously reported, held the top spot in the previous quarter.

66% of professionals from over 30 industries stated in the poll that AI would deliver either slight or significant disruption to their industry. This was a sharp increase from the previous quarter when 49% said AI would disrupt their industry. It returns AI to the position that it held in Q4 2020 and Q1 2021.

AR has gone in the opposite direction, and now only 48% see the technology as disruptive, down from 70% in Q2 2021.

The interactive tech, which blends the physical space with digital visualisations, reached wider public knowledge through the Pokémon Go craze in 2016, but also has real business potential. AR tech, for example, is being used for remote collaboration, training, maintenance, customer support and product design.

It's also seeing large uptake in ecommerce as a utility, both for consumers and brands. Various social giants have merged AR and ecommerce into their social media platforms, offering users the ability to try on products virtually.

The buzz around AR has grown recently thanks to Facebook's expansion into the metaverse, a virtual world where users can share experiences and interact in real-time within simulated scenarios. This is made possible through AR applications and virtual reality (VR) headsets.

Facebook is banking on the digital world enough to have renamed itself as Meta in October. The recent name change came alongside the company's pivot to becoming a metaverse company instead of a social media one (and an increasingly toxic social media brand at that).

Name change or not, it seems insiders may have seen through the AR hype, according to GlobalData's polling.

"The greater variation in perceptions of AI and AR noted in Q3 2021 could be because, like cybersecurity and cloud computing, both have a wide range of applications," explains Rupantar Guha, associate project manager of Thematics at GlobalData. "However, unlike cyber and cloud, deployment of AI and AR is at an early stage."

"Regarding the metaverse, AR and VR are key technologies in this developing mega-theme. VR-based metaverses are arriving in the market (e.g., Facebook's Horizon Workrooms) and AR-based metaverses (e.g., Microsoft Mesh) are also in development.

"It is too early to say which technology will outpace the other in the short run, given that both are in nascent stages of development and the metaverse is still largely conceptual. However, AR's accessibility through web browsers, smartphones, and smart glasses (that are less expensive than VR headsets) could give it an edge over VR in the long run."

In the meantime, it is AI which business bosses are banking on as the emerging technology of choice.

Of the emerging technologies included in GlobalData's polls every quarter, perceptions of AI and AR are the most volatile. In the company's view, this is because confidence in the disruptive potential of the two technologies is fragile and likely to continue to vary as more companies implement them.

The majority of the respondent pool said that they felt more positive towards emerging technologies in Q3 2021 than a year ago. At least half of all those polled said they were more positive towards four of the seven technologies that GlobalData enquired about: cybersecurity, AR, AI and 5G.

AR and AI were behind only cybersecurity in positive sentiment change. This indicates that, despite the volatile perceptions of the two technologies regarding the level of disruption they can bring, a majority still felt more favourable than in 2020.

55% of respondents said that AI would live up to all its promises, which is only a small drop from the 57% who said the same in the previous quarter. The continued good performance of AI in this indicator suggests that businesses are hopeful that the technology will ultimately deliver significant benefits.

GlobalData predicts the global AI platform market will be worth $52bn within three years. The burgeoning theme is driven by the obvious business benefits of AI.

The tech allows businesses to accelerate digital innovation and development, resulting in increased efficiency, lower operational costs, higher revenues and improved customer experience.

Enterprise AI projects often share three main objectives. One is the automation of business processes as automating routine day-to-day activities and obligations contributes to more efficient use of labour, with workers able to focus their time and energy on higher-value tasks. AI lets businesses also reduce operating costs and cut out errors that come part and parcel with routine processes and tasks in the workplace.

AI can also provide business insights that make sense of vast amounts of data to predict customer preferences and generate high-quality sales leads. Essentially converting information into knowledge, AI can help with everything from providing personalised product recommendations to identifying credit fraud.

Finally, the tech improves customer engagement. AI-driven customer service capabilities such as virtual assistants and chatbots enable businesses to communicate with high volumes of customers every day, something which proved especially useful in the pandemic, when face-to-face options weren't possible. The pseudo-human face of AI can provide a more personalised experience that drives growth, reduces costs, and improves retention and overall customer satisfaction.

"Companies must remember, though, that delivering a successful AI project is not easy," warns GlobalData thematic research director Ed Thomas. "It requires meticulous planning, detailed preparation, and complete buy-in from all parts of the business. It also requires an understanding of the problem that needs solving."

"AI alone will not cure all ills, but it can successfully address specific business challenges."

Another part of AI's attraction is that the tech is, arguably, a rising tide lifting the profile of various emerging technologies.

Take the example of AR. AI technologies such as machine learning (ML), conversational platforms, and AI chips power most of today's AR devices and apps.

AR developers use ML to improve the user experience (UX) by continually analysing user activities. Apple's Core ML and Google's TensorFlow Lite ML frameworks support ARKit and ARCore, respectively, and allow developers to run ML models to improve object recognition in photos, convert speech to text, and enable gesture recognition. Eye tracking and facial recognition, fast becoming standard functions across all AR devices, use ML to improve UX.

Virtual assistants like Amazon's Alexa, meanwhile, enable the hands-free operation of AR devices. This is critical for some use cases, especially in enterprises: for example, doctors using Vuzix's M400 smart glasses for training and conducting patient rounds remotely during the pandemic. Equipped with hands-free voice support for Skype for Business and Zoom, the glasses help to keep human contact to a minimum with the aid of AI-powered conversation.

"Smart glasses' and headsets' heads-up displays bring visuals into the user's field of view, while voice assistants enable voice-based control of devices and apps," elaborates Guha. "The use of voice in AR devices is limited to specific tasks that add convenience to the user, helping them avoid using hands.

"Voice assistants are used as supporting capabilities such as pulling up apps to view documents and connecting with remote experts in industries such as healthcare, oil & gas, logistics, and manufacturing, among others."

Many organisations are also putting their faith in AI to improve their cybersecurity, with the tech providing cover for the continuing cybersecurity skills gap.

AI offers a more proactive defensive approach to discover and analyse the growing landscape for attacks. As GlobalData reports, the reality is that all software is at risk through human error and inadvertent security holes, which attackers can exploit.

With AI, the target is to have more comprehensive, predictive assessments of breach risk that recognise and prioritise the necessary steps to avoid breaches. The concept is enough to have led UK-US cybersec brand Darktrace to make its name and go public. Its Cyber AI Analyst aims to emulate human thought processes and automate tasks to continuously investigate cyber threats at machine speeds. The proprietary software is said to reduce the average time to investigate threats by 92%.

Darktrace was founded by Cambridge University mathematicians and US-UK government cyber intelligence experts, backed by infamous Autonomy founder Mike Lynch, once dubbed Britain's Bill Gates and today a wanted man facing extradition proceedings from the US over the sale of Autonomy to HP in 2011. Darktrace's original AI tech, the Enterprise Immune System, was supplemented by autonomous response technology, which allowed the system to react to in-progress cyberattacks.

It should be noted though that Darktrace shares recently plunged by 23% over what's seen as a gap "between promise and reality" regarding its products: and the legacy of Lynch and Autonomy hangs over the company. Several executives at Darktrace, including its founder and CEO Poppy Gustafsson, have previously held roles at Autonomy and at Invoke Capital, Lynch's VC fund.

There is also the concept of AIoT, an amalgamation of AI and the Internet of Things (IoT). AIoT involves embedding AI technology into IoT components; combining data collected by connected sensors and actuators with AI allows for reduced latency, increased privacy and real-time intelligence at the edge. It also means that less data needs to be sent to, and stored on, cloud servers.

Apple is one name investing in this nascent field; in January 2020, the tech giant acquired Xnor.ai, which offered AI-enabled image recognition tools capable of functioning on low-power devices.

Being a rising tide that lifts all boats, and even profit margins, it's no wonder AI is such an obsession for the global business community.

GlobalData polls were conducted online between the first week of July and the fourth week of September 2021, and ran on Verdict, GlobalData's network of B2B websites. In total, the polls received 2,128 responses, distributed unequally between the individual polls.

This article is part of a special series by GlobalData Media on artificial intelligence. Other articles in this series include:

See more here:
Why is AI an obsession for business insiders? - just-auto.com


Chess Corner: It’s good to be kind 11.18.21 – Muskogee Daily Phoenix

Black to move and win

DIAG 1

This week's chess problem is a lesson in giving. Often when we give, there is a butterfly effect and we end up better off than we were before. With this hint in mind, please try to find Black's winning tactical assault.

The position seems roughly equal, but Black's pressure along the d-file is decisive. Black first makes an exchange sacrifice: the rook on d8 captures White's bishop on d2. White recaptures the rook with its knight on f3.

Black next slides its other rook over to d8 (see next diagram).

DIAG 2

Because of the bishop pin on White's d2 knight and the added pressure from the rook, White is sure to lose a piece.

In chess news, the 18-year-old French-Iranian wonderkid Alireza Firouzja won the FIDE Chess.com Grand Swiss on Sunday in Latvia, with American Fabiano Caruana placing second. This qualified Firouzja and Caruana for two of the eight spots in the 2022 Candidates Tournament, which determines the challenger to the World Chess Champion.

Reach Eric Morrow at ericmorrowlaw@gmail.com or (505) 327-7121.


Continued here:
Chess Corner: It's good to be kind 11.18.21 - Muskogee Daily Phoenix
