Heard on the Street 1/24/2022 – insideBIGDATA

Welcome to insideBIGDATA's Heard on the Street round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Enjoy!

COVID-19: A Data Tsunami That Ushered in Unprecedented Opportunities for Businesses and Data Scientists. Commentary by Thomas Hazel, founder & CTO at ChaosSearch

From creating volatile data resources to negatively impacting forecasting models, the pandemic has caused countless challenges for organizations that rely on data to inform business decisions. However, there is also an upside to the data tsunami that COVID-19 created. The movement to all-things-digital translated into a tsunami of log data streaming from these digital systems. All this data presented an incredible opportunity for companies to deeply understand their customers and then tailor customer and product experiences. However, they'd need the right tools and processes in place to avoid being overwhelmed by the volume of data. The impact spans all industries, from retail to insurance to education.

Blackboard is a perfect example. The world-leading EdTech provider was initially challenged at the start of the pandemic by the surge of daily log volumes from students and school systems that moved online seemingly overnight. The company quickly realized it needed a way to efficiently analyze log data for real-time alerts and troubleshooting, as well as a method to access long-term data for compliance purposes. To accomplish this, Blackboard leverages its data lake to monitor cloud deployments, troubleshoot application issues, maximize uptime, and deliver on data integrity and governance for highly sensitive education data.

This use case demonstrates just how important data has become to organizations that rely on digital infrastructure, and how a strong data platform is a must to reduce the time, cost, and complexity of extracting insights from data. While the pandemic created the initial data tsunami, tech-driven organizations like Blackboard that have evolved to capitalize on its benefits have accepted that this wave of data is now a constant force they will have to manage effectively for the foreseeable future.

Cloud Tagging Best Practices. Commentary by Keith Neilson, Technical Evangelist at CloudSphere

While digital transformation has been on many organizations' priority lists for years, the COVID-19 pandemic applied more pressure and urgency to move it forward. Through their modernization efforts, companies have unfortunately wasted time and resources on unsuccessful data deployments, ultimately jeopardizing company security. For optimal cyber asset management, consider the following cloud tagging best practices.

First, take an algorithmic approach to tagging. While tags can represent simple attributes of an asset (like region, department, or owner), they can also assign policies to the asset. This way, assets can be effectively governed, even on a dynamic and elastic platform. Next, optimize tagging for automation and scalability. Proper tagging allows for rigorous infrastructure provisioning for IT financial management, greater scalability, and automated reporting for better security. Finally, be sure to implement consistent cloud tagging processes and parameters within your organization. Designate a representative to enforce certain tagging formulas, retroactively tag assets or functions that IT personnel didn't think to tag, and reevaluate business outputs to ensure tags are effective.

While many underestimate just how powerful cloud tagging can be, the companies embracing this practice will ultimately experience better data organization, security, governance, and system performance.
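The algorithmic approach described above, where tags carry policies as well as simple attributes, can be sketched roughly as follows. This is a minimal illustration not tied to any particular cloud provider; the tag keys, classification values, and retention rules are hypothetical examples.

```python
# Sketch: enforcing a tagging policy across cloud assets.
# Tag keys, values, and retention rules are hypothetical examples.

REQUIRED_TAGS = {"owner", "department", "data-classification"}

# "Policy tags": a tag value algorithmically determines a governance rule.
RETENTION_POLICY = {
    "public": {"retention_days": 30},
    "sensitive": {"retention_days": 365, "encrypt": True},
}

def missing_tags(asset):
    """Return the set of required tags absent from an asset."""
    return REQUIRED_TAGS - set(asset.get("tags", {}))

def resolve_policy(asset):
    """Derive governance settings from an asset's classification tag."""
    cls = asset.get("tags", {}).get("data-classification", "public")
    return RETENTION_POLICY.get(cls, RETENTION_POLICY["public"])

assets = [
    {"id": "vm-001", "tags": {"owner": "alice", "department": "finance",
                              "data-classification": "sensitive"}},
    {"id": "bucket-07", "tags": {"owner": "bob"}},  # under-tagged asset
]

for asset in assets:
    gaps = missing_tags(asset)
    if gaps:
        # Flag for retroactive tagging, as the practice above recommends.
        print(f"{asset['id']}: missing tags {sorted(gaps)}")
    else:
        print(f"{asset['id']}: policy {resolve_policy(asset)}")
```

Because the policy is derived from tags rather than hard-coded per asset, the same check scales automatically as assets are added or moved, which is the point of optimizing tags for automation.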

Using AI to improve the supply chain. Commentary by Melisa Tokmak, GM of Document AI, Scale AI

As supply chain delays continue to threaten businesses at the beginning of 2022, AI can be a crucial tool for logistics companies to speed up their supply chains as the pandemic persists. Logistics and freight forwarding companies must process dozens of document types, such as bills of lading, commercial invoices, and arrival notices, quickly and with the utmost accuracy in order to report data to Customs, understand changing delivery timelines, and collect and analyze data about moving goods to paint a picture of global trade. For already overtaxed and paperwork-heavy systems, manual processing and human error are some of the most common points of failure, exacerbating shipping delays and resulting in late cargo, delayed cash flow, and hefty fines.

Because logistics companies have a wealth of information buried in the documents they process, updating databases with this information is necessary to make supply chains more predictable globally. Most companies spend valuable time analyzing inconsistent data or navigating OCR and template-based solutions, which aren't effective due to the high variability of data in these documents. Machine learning-based, end-to-end document processing solutions, such as Scale AI's Document AI, don't rely on templates and can automate this process; AI solutions allow logistics companies to leverage the latest industry research without changing their developer environment. This way, companies can focus on using their data to serve customers and the entire logistics industry, rather than spending valuable time and resources on data mining.

ML-based solutions can extract the most valuable information accurately in seconds, accelerating internal operations and reducing the number of times containers are opened for checks, which decreases costs and shipping delays significantly. Using Scale's Document AI, freight forwarding leader Flexport achieved significant cost savings in operations and decreased the processing time of each document. Flexport's documents formerly took over two days to process; with Document AI, they are processed in less than 60 seconds with 95%+ accuracy, all without Flexport having to build and maintain a team of machine learning engineers and data scientists. As COVID has led to a breakdown of internal processes, AI-powered document processing solutions are helping build systems back up, optimizing operations to handle whatever logistics needs come their way at such a crucial time.

IBM to Sell Watson Health. Commentary by Paddy Padmanabhan, Founder and CEO of Damo Consulting

IBM's decision to sell the Watson Health assets is not an indictment of the promise of AI in healthcare. Our research indicates AI was one of the top technology investments for health systems in 2021. Sure, there are challenges such as data quality and bias in the application of AI in the healthcare context, but by and large there has been progress with AI in healthcare. The emergence of other players, notably Google with its Mayo partnership and Microsoft with its partnership with the healthcare industry consortium Truveta, is a strong indicator of progress.

Data Privacy Day 2022 Commentary. Commentary by Lewis Carr, Senior Director, Product Marketing at Actian

In 2022, expect to see all personal information and data sharing options become more granular in how we control them, both on our devices and in the cloud, specific to each company, school, or government agency. We'll also start to get some visibility into, and control over, how our data is shared between organizations without our involvement. Companies and public sector organizations will begin to pivot away from binary options (opt-in or opt-out) tied to a lengthy legal letter that no one reads, and will instead provide data management and cybersecurity platforms with granular permissions over parts of your personal data, such as where it's stored, for how long, and under what circumstances it can be used. You can also expect new service companies to sprout up that will offer intermediary support to monitor and manage your data privacy across organizations.

Data Privacy Day 2022 Commentary. Commentary by Rob Price, Principal Expert Solution Consultant at Snow Software

The adoption of cloud technology has been a critical component of how we approach privacy and data protection today. A common misconception is that if your data is offsite or cloud-based, it's not your problem; that is not true, because the cloud is not a data management system. Two fundamental factors for data protection and security are the recovery point objective (how old can data be when you recover it?) and the recovery time objective (how quickly can you recover the data?). Every company's needs are different, but these two factors are important when planning for data loss.
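Those two objectives can be made concrete with a small check. Assuming a simple periodic-backup model (the intervals and targets below are illustrative, not recommendations), the worst-case data age at recovery is one full backup interval, so RPO is bounded by how often you back up, while RTO is bounded by how long a restore takes:

```python
# Sketch: checking a backup plan against RPO/RTO targets.
# All figures are illustrative assumptions.

def meets_rpo(backup_interval_hours, rpo_hours):
    """Worst-case data age at recovery is one full backup interval."""
    return backup_interval_hours <= rpo_hours

def meets_rto(restore_time_hours, rto_hours):
    """Recovery must complete within the allowed downtime window."""
    return restore_time_hours <= rto_hours

# Example plan: nightly backups (24 h interval), 4 h to restore,
# measured against a 12 h RPO and 6 h RTO target.
plan = {"backup_interval_hours": 24, "restore_time_hours": 4}
targets = {"rpo_hours": 12, "rto_hours": 6}

print("RPO met:", meets_rpo(plan["backup_interval_hours"], targets["rpo_hours"]))
print("RTO met:", meets_rto(plan["restore_time_hours"], targets["rto_hours"]))
```

In this example the plan meets the RTO but misses the RPO: nightly backups can leave up to 24 hours of data unrecoverable, so the backup interval, not the restore speed, is what needs tightening.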

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 (https://twitter.com/InsideBigData1)

