
Rural Cloud Initiative Completes First Phase of the Farm of the Future – PRNewswire

BOULDER, Colo., Oct. 5, 2020 /PRNewswire/ -- Trilogy Networks and the Rural Cloud Initiative (RCI) announced today that the RCI has completed Phase 1 of its first Farm of the Future deployment at Hurst Greenery in Westboro, MO. This first phase, announced on June 30 and completed on September 30, consists of the Trilogy nationwide LinX Network and the ConEx regional edge cloud platform hosting real-time Internet of Things (IoT) services delivered over a private LTE network to Hurst Greenery. This pilot proves out the Trilogy architecture and, as it moves into production later this year, will power a variety of advanced precision agriculture applications, including sensors, monitoring devices, satellite mapping solutions, drones and even robots. Trilogy Networks integrated technologies from multiple best-in-class network and solution providers, including Celona, Chat Mobility, ClearBlade, FMTC Data Center, IAMO Communications, Lanner Electronics, Midwest Data Center, Pluribus Networks, and SBA Edge Data Center, to create a custom solution specific to Hurst Greenery's needs.

"This first phase of the Farm of the Future project will result in a 10 percent increase in efficiency and profit, allowing for increases in yields and cost savings," said Blake Hurst, owner of Hurst Greenery, president of the Missouri Farm Bureau, and vice president of the FCC Precision Connectivity Working Group. "Today's announcement is an exciting first step toward deploying advanced technologies that will help Hurst Greenery and more than 2,000 other area farms streamline their operations and maximize their productivity."

Those applications deployed at the edge of the network will give Hurst Greenery, as well as potentially another 2,300 farms in a two-state area, the ability to increase their operational efficiency, reduce operating costs, and use increased automation to monitor and control crops and environmental systems. This Farm of the Future showcases the potential of a new way to deliver advanced solutions to rural markets. Edge-as-a-service, or EaaS, will give farms and other rural industries the capability to deploy advanced technology solutions quickly, seamlessly, and cost-effectively without having to invest in expensive infrastructure.

In addition to building the private LTE network, Phase 1 of the Farm of the Future project included deploying a platform interface to connect an array of IoT applications, establishing the LinX network for long-haul connectivity, and providing distributed cloud capability via the ConEx edge cloud platform. Trilogy Networks collaborated with nine other Rural Cloud Initiative members to complete this initial phase: Celona, Chat Mobility, ClearBlade, FMTC Data Center, IAMO Communications, Lanner Electronics, Midwest Data Center, Pluribus Networks, and SBA Edge Data Center.

"This project demonstrates the ability of RCI members to collaborate for a successful outcome, a core mission of the Rural Cloud Initiative," said Brian Spurgeon, general manager at Chat Mobility and newly elected chairman of the RCI Advisory Council for Rural Edge Solutions (ACRES).

The first application deployed on the private LTE network is already providing real-time environmental data for temperature and humidity over approximately two acres of greenhouses and another several acres of farmland, allowing Hurst Greenery to manage fans, heaters, and other environmental equipment, and to receive immediate alarm alerts when an issue arises.

"Completing this first phase of our Farm of the Future project was an immense undertaking requiring the close collaboration of multiple carrier and edge innovation partners," said George Woodward, president and CEO of Trilogy Networks. "To do it in just under three months is nothing short of amazing. Our ability to build the network and bring the platform and applications online so quickly and to already be seeing positive results shows the enormous business value edge connectivity can bring to America's farms."

About the Rural Cloud Initiative

The Rural Cloud Initiative is a unique coalition of more than 45 network and edge innovation partners committed to promoting and accelerating the digital transformation of rural America. The RCI partners are working together to deploy edge solutions running on a unified, distributed cloud covering an area of 1.5 million square miles of rural America, providing the essential infrastructure for 5G, agriculture, and energy solutions.

For more information on the Rural Cloud Initiative, visit: https://ruralcloud.com/

For more information on Trilogy Networks, visit https://trilogynet.com/

Media Contact for Trilogy: John O'Malley, 585-261-5899, [email protected]

SOURCE Rural Cloud Initiative

https://ruralcloud.com


Oklahoma Hospital Selects Evident EHR and TruBridge RCM Solutions for Their Integration, Intuitive Workflows, and Financial Efficiencies – Business Wire

MOBILE, Ala.--(BUSINESS WIRE)--Evident, LLC, a wholly owned subsidiary of CPSI (NASDAQ: CPSI) and a leading provider of electronic health record (EHR) systems and services, announced today that Haskell Regional Hospital, Inc., an 18-bed critical access hospital (CAH) located in Stigler, Oklahoma, has selected the Evident EHR solution through a Software-as-a-Service (SaaS) licensing agreement. This includes the full suite of clinical, financial and workforce management applications and the Revenue Cycle Management (RCM) and cloud-hosting services offered through CPSI sister company, TruBridge, LLC.

Boa Vida Healthcare manages Haskell Regional Hospital, Inc., the second of its hospitals to implement Evident's EHR solution, following Monroe Regional Hospital located in Aberdeen, Mississippi. Like many rural facilities, Haskell Regional Hospital, Inc. faced challenges with profitability and risk of closure, which were further affected by the COVID-19 pandemic. Running the Evident EHR solution will afford Haskell Regional Hospital, Inc. the opportunity to take advantage of advanced, modern and integrated technology solutions to improve workflow efficiencies for providers and facilitate the transfer of patient information across a variety of settings, including emergency department, lab, patient rehab and inpatient care.

"Haskell Regional Hospital, Inc. is the first to implement the Evident EHR solution under our new corporate agreement with Evident," said KJ (Kirnjot) Singh, MD, president of Boa Vida Healthcare. "We believe the benefits from the Evident EHR solution, plus the complementary offerings from TruBridge, have already delivered significant value to Monroe Regional Hospital and will do the same for Haskell Regional Hospital, Inc. and many more of our facilities in the future. The competition simply can't compete with what Evident has to offer community hospitals. Its experience and longstanding focus on rural healthcare delivery really set Evident apart."

The TruBridge RCM solution will create efficiencies and drive revenue cycle success for both front and back office staff at Haskell Regional Hospital, Inc. and future facilities due to the corporate agreement. "Through integration with the Evident EHR solution, our teams will be able to easily reconcile charges and eliminate duplicate entries, making our patient billing process more accurate and having a positive impact on profitability, which is key to solid business operations," Singh added.

According to Boyd Douglas, president and chief executive officer of CPSI, "The combination of our industry-leading EHR and RCM solutions continues to bring significant value to our clients and their ability to improve both patient and financial outcomes for the communities they serve. In partnership with Boa Vida Healthcare, we look forward to bringing this winning combination to other rural communities across the U.S."

Haskell Regional Hospital, Inc. is expected to be live on the Evident system in November 2020.

About Evident

Evident, a member of the CPSI family of companies, recognizes the challenges hospitals, clinics and other healthcare providers face: the need for simplicity, cost containment and delivery of a quality healthcare experience for patients and physicians alike. Our integrated software solutions are backed by a proactive support approach, making us the partner of choice for hundreds of healthcare organizations. For more information, visit http://www.evident.com.

About TruBridge

TruBridge, a member of the CPSI family of companies, provides business and consulting services, and an end-to-end Revenue Cycle Management (RCM) solution. With our arsenal of RCM offerings that include an HFMA Peer Reviewed product and an HFMA Peer Reviewed complete outsourcing service, TruBridge helps hospitals, physician clinics, and skilled nursing organizations of all sizes become more efficient at serving their communities. For further information visit http://www.trubridge.com.

About Boa Vida Healthcare

Boa Vida Healthcare brings together a team of professionals with decades of experience in clinical medicine and the management of hospitals. Boa Vida Healthcares mission is to save and improve hospitals so that they can deliver compassionate, quality care to patients and revitalize communities. Boa Vida Healthcare focuses on the unique needs of rural and urban safety-net hospitals.

Forward-Looking Statements

This press release contains forward-looking statements within the meaning of the safe harbor provisions of the Private Securities Litigation Reform Act of 1995. These forward-looking statements can be identified generally by the use of forward-looking terminology and words such as "expects," "anticipates," "estimates," "believes," "projects," "targets," "predicts," "intends," "plans," "potential," "may," "continue," "should," "will" and words of comparable meaning. Without limiting the generality of the preceding statement, all statements in this press release relating to the ability of Evident and TruBridge to successfully partner with Boa Vida Healthcare and Haskell Regional Hospital, Inc. are forward-looking statements. We caution investors that any such forward-looking statements are only predictions and are not guarantees of future performance. Certain risks, uncertainties and other factors may cause actual results to differ materially from those projected in the forward-looking statements. Such factors may include: risks related to the ability of Evident's EHR solution to improve workflow efficiencies for providers and facilitate the transfer of patient information across a variety of settings and the ability of TruBridge's RCM solution to create efficiencies and drive revenue cycle success; the ability of CPSI to enable Haskell Regional Hospital, Inc. to go live on the Evident system in November 2020; the impact of COVID-19 and related economic disruptions which have materially affected the Company's revenue and could materially affect the Company's gross margin and income, as well as the Company's financial position and/or liquidity; actions to be taken by the Company in response to the pandemic; the legal, regulatory and administrative developments that occur at the federal, state and local levels; potential disruptions, breaches, or other incidents affecting the proper operation, availability, or security of the Company's or its partners' information systems, including unauthorized access to or theft of patient, business associate, or other sensitive information or inability to provide patient care because of system unavailability; changes in revenues due to declining hospital demand and deteriorating macroeconomic conditions (including increases in uninsured and underinsured patients); potential increased expenses related to labor or other expenditures; and the impact of our substantial indebtedness and the ability to refinance such indebtedness on acceptable terms or at all, as well as risks associated with disruptions in the financial markets and the business of financial institutions as the result of the COVID-19 pandemic which could impact us from a financial perspective. Numerous other risks, uncertainties and other factors may cause actual results to differ materially from those expressed in any forward-looking statements. Such factors include risk factors described from time to time in CPSI's public releases and reports filed with the Securities and Exchange Commission, including but not limited to, CPSI's most recent Annual Report on Form 10-K and Quarterly Reports on Form 10-Q. We also caution investors that the forward-looking information described herein represents CPSI's outlook only as of this date, and CPSI undertakes no obligation to update or revise any forward-looking statements to reflect events or developments after the date of this press release.


Best Practices for Migrating SharePoint to Office 365 – The Sports Bank

Microsoft Office 365's user population recently hit 88 million users, as noted on WinBeta. As the user base continues to grow, we've seen increasing interest among clients in moving their on-premises SharePoint to Office 365. The most important part of a SharePoint migration project is the planning for the migration itself. With all the elements involved, such as the restrictions on migration options, poor planning can complicate the project and introduce unwanted risks. It is important to plan a SharePoint migration sensibly and fully consider everything involved in the process. Once a decision is made to move your SharePoint sites to Office 365, you will need to decide what your business actually needs. Whether your new platform will be SharePoint Online or hybrid, the references below should be helpful.

Before you migrate SharePoint to Office 365:

- If you have SharePoint installed on desktop systems, run the OnRamp for Office 365 tool to help you find activities linked to Office 365.
- If you plan on using a custom domain, make sure this domain has been verified.
- Before a SharePoint Online tenant-to-tenant migration starts, make sure everything is ready with the appropriate licenses, network connectivity, firewalls, and security.
- To use Azure AD Connect, your on-premises AD schema and forest functional level must operate at Windows Server 2003 or later.
- If you are considering deploying ADFS, you need to use SSL certificates.
- Take a record of your content, information data, design, and custom solutions.
- Decide what to move, taking only what you will need and archiving or deleting redundant or legacy data.
- Select a proper migration service provider, like Apps4Rent, to assist you with migrating to Office 365.

Things to keep in mind as you migrate to Office 365:

Start with a trial migration using a representative sample of data to confirm the technical feasibility of the migration and identify any mistakes. Divide the whole migration process into batches; dividing content into batches is especially important for bigger companies, as shown in the sketch below.
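
As a rough illustration of the batching step, here is a minimal Python sketch that groups sites into size-capped batches and carves off a pilot batch to migrate and verify first. The site names, sizes and cap are hypothetical placeholders, not a real migration API.

```python
# Illustrative sketch only: site names, sizes and the size cap are
# hypothetical; a real migration would feed these batches into an
# actual migration tool.

def make_batches(sites, max_gb_per_batch=50):
    """Greedily group sites into batches under a size cap."""
    batches, current, current_size = [], [], 0
    for name, size_gb in sorted(sites, key=lambda s: s[1], reverse=True):
        if current and current_size + size_gb > max_gb_per_batch:
            batches.append(current)
            current, current_size = [], 0
        current.append(name)
        current_size += size_gb
    if current:
        batches.append(current)
    return batches

sites = [("HR", 12), ("Finance", 35), ("Projects", 48), ("Archive", 9)]
batches = make_batches(sites)

# Pilot first: migrate a small representative batch, verify, then continue.
pilot, remaining = batches[0], batches[1:]
print("pilot batch:", pilot)
print("remaining batches:", remaining)
```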

Things to keep in mind after migrating to Office 365:

Check and confirm the success of the migration to ensure that it meets the requirements of the business. This includes checking network health, permissions and customized solutions, and repeating these checks for each batch of the migration. To complete the transition of users to the target environment, you must stop the source environment and conduct a final synchronization of changes; do this for each migration batch.
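
A hedged sketch of that per-batch verification loop follows; the checks are stand-ins for whatever item-count, permission and customization tests your organization actually runs, not real Office 365 API calls.

```python
# The check functions are placeholders: wire them to your own
# source/target comparisons (item counts, permissions, customizations).

def verify_batch(batch, checks):
    """Run every post-migration check against every item in a batch."""
    failures = []
    for item in batch:
        for check_name, check in checks.items():
            if not check(item):
                failures.append((item, check_name))
    return failures

checks = {
    "item_count_matches": lambda item: True,   # compare source vs target counts
    "permissions_intact": lambda item: True,   # spot-check access rights
    "custom_solutions_ok": lambda item: True,  # confirm customizations still load
}

for batch in [["HR", "Finance"], ["Projects"]]:
    failures = verify_batch(batch, checks)
    if failures:
        print("fix before migrating the next batch:", failures)
    else:
        print("batch verified:", batch)
```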

Ensuring Success with Your Migration:

Migrating existing content to SharePoint Online is not easy. Ideally, companies should spend time planning, discovering, and auditing the content. To decrease risk, start with the pre-migration list and implement complete testing after each migration batch. That's how a good service provider will support organizations with SharePoint and Office 365 upgrades and migrations.

Baffled by the complex process of migration and want professional help? Look no further than Apps4Rent!

Apps4Rent is a cloud hosting service provider that offers quality service using the latest technology at a cost that will bring tears of joy to your eyes. Use the latest technology of renting a desktop from the cloud with Apps4Rent. To get more information on the advantages of using SharePoint on a Hosted Virtual Desktop, visit our website today!


A-List Osthoff Resort Positions for Travel Restart; Selects Maestro PMS During Industry Pause for Guest-Focused Mobile Services and Integrated…


MARKHAM, Ontario (PRWEB) October 06, 2020

The Osthoff Resort is a stunning AAA Four-Diamond year-round resort, spa and event destination in Elkhart Lake, Wisc. The 238-room resort offers its award-winning Aspira Spa, the L'ecole de la Maison cooking school, its family-friendly Pleasures Program, gift shops, dining outlets, as well as a fitness center, arcade and outdoor activities venue. Adam Hartenberger, Reservations Sales Manager for The Osthoff Resort, said, "It was the resort's goal to bring all its property departments and their third-party systems together on one data platform for personal guest service at every touchpoint. Maestro PMS will do this perfectly."

"Our property is operated in silos for guest rooms, spa, retail, events, and guest activities. Now with Maestro we will be able to combine data and services from all our third-party systems in one Maestro Single-Image database for effortless communication and more personal guest service," Hartenberger said. "Maestro brought many important factors to our decision. The first was its strength in integrated data collection and analysis. Maestro's Analytics Business Intelligence system will combine all the data from our multiple operation locations for robust analysis to support more profitable management by the numbers. Maestro also allows us to self-host our system on property with a hybrid deployment of Windows, Web and Mobile applications. This will give us the flexibility of accessing the system with standard Windows terminals, mobile devices, or web-based terminals and keep our data on property. The Maestro integrated solution provides a single guest itinerary that includes all aspects of the guest experience on property, including front desk and condo owner operations, spa and activities, and loyalty management."

"Another advantage of Maestro is its ability to communicate with our many third-party vendor systems," he said. "It easily integrates with our other tech partners across our unique resort. Maestro simplifies interface deployment thanks to its Genomi open API that supports deeper capabilities for communication with other systems."

The Maestro Property Management System delivers flexible and scalable deployment options, with an identical full-featured web browser or Windows solution available hosted in the cloud or on premise to offer the best of both worlds. Maestro's hotel management software applications and services centralize operations and provide personalized and touchless mobile guest service tools to enhance the guest experience while also supporting a more secure stay. In addition to implementing Maestro's flexible PMS platform and multiple modules, the resort will also use Maestro's Condo Owner Management to offer owners secure online access to their statements and reservation activity, and to book their own units.

Mobile operations were also essential to Osthoff Resort's system decision. "Maestro's mobile pre-check-in, express mobile check-out and digital signature capture will streamline our front office processes and eliminate several unnecessary points of physical guest contact," Hartenberger said. "Plus, we can use Maestro web on tablets to check in guests remotely in different parts of the property for greater guest convenience. This mobile flexibility and the automated built-in guest email communications will enable us to achieve our goal of going nearly paperless. Even our housekeeping staff will use tablets for instant communication to get guests to their rooms faster with less paper."

Maestro's mobile housekeeping also supports a soft check-in feature allowing guests to be checked into their reservation if the room is not yet ready. It will set a priority-clean alert for housekeeping, allow the guest to enjoy on-property amenities and post charges to their folio, and update both the guest and the front desk when the room is ready. The Osthoff Resort will also take advantage of the Maestro prepayment portal for online guest self-payments as well as the integrated online guest survey system. This offers both a post-check-in and a check-out survey while integrating the results directly into the guest profile, allowing for proactive guest management.
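
To make that flow concrete, here is an illustrative Python sketch of the soft check-in sequence as described; the class and method names are invented for illustration and are not Maestro's actual API.

```python
# Hypothetical model of the soft check-in workflow; not Maestro code.

class Room:
    def __init__(self, number):
        self.number = number
        self.clean = False

class SoftCheckIn:
    def __init__(self, guest, room):
        self.guest, self.room, self.folio = guest, room, []
        if not room.clean:
            # Guest is checked in before the room is ready...
            print(f"Room {room.number}: priority-clean alert sent to housekeeping")

    def post_charge(self, description, amount):
        # ...and can enjoy on-property amenities, charging the folio meanwhile.
        self.folio.append((description, amount))

    def room_ready(self):
        self.room.clean = True
        print(f"Room {self.room.number} ready: notify {self.guest} and the front desk")

stay = SoftCheckIn("A. Guest", Room(214))
stay.post_charge("Spa", 120.00)
stay.room_ready()
```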

The Osthoff Resort is a year-round operation, but its staffing fluctuates to host its many large events, which include its award-winning Christmas Market and Jazz on the Vine concert series. This makes system ease-of-use and online training essential. "Maestro offers instant Live Chat Support & Training directly from any application screen," Hartenberger said. "Also, eLearning modules within the system make it much easier to onboard new staff during our large events. Ease-of-use is important because our team is the front line of guest service and they need to be proficient with our systems. Altogether, Maestro was the best choice for our complex operation."

Warren Dehan, Maestro PMS President, said, "The Osthoff Resort is a great example of how independent resorts can leverage this time to evaluate their systems and take advantage of the latest innovations from trusted solutions providers in preparation for the new and even more digital guest experience. With the ever-changing needs the industry dictates, offering support to an exhaustive list of third-party tech partners will also help enhance the digital guest journey and internal operations. Maestro is pleased to be part of The Osthoff Resort's 21st-century solutions upgrade."

# # #

About Maestro

Maestro is the preferred cloud-hosted and on-premise PMS solution for independent hotels, luxury resorts, conference centers, vacation rentals, and multi-property groups. Maestro's PCI-certified and EMV-ready enterprise system offers 20+ integrated modules on a single database, including touchless and mobile apps to increase profitability, drive direct bookings, centralize operations, and enable operators to engage guests with a safe and personalized experience. For over 42 years Maestro's Diamond Plus Service has provided unparalleled 24/7 North American-based support and education services to keep hospitality groups operational and productive. Click here for more information on Maestro. Click here to get your free PMS Buying guide.

About The Osthoff Resort

Rated one of the best lakeside resorts in Wisconsin to spend summer vacation, you're going to love Your Place on the Lake. The Osthoff Resort is set on 500 feet of Elkhart Lake's pristine shoreline. Discover spacious suites, cozy surroundings, a variety of restaurants and dining options, fun things to do for the whole family, a world-class spa, a cooking school, and one of the most beautiful venues for hosting a wedding, conference, or family reunion. Please click here for information.


Berkeley Lab Technologies Honored With 7 R&D 100 Awards – Lawrence Berkeley National Laboratory

Innovative technologies from Lawrence Berkeley National Laboratory (Berkeley Lab) to achieve higher energy efficiency in buildings, make lithium batteries safer and higher performing, and secure quantum communications were some of the inventions honored with R&D 100 Awards by R&D World magazine.

For more than 50 years, the annual R&D 100 Awards have recognized 100 technologies of the past year deemed most innovative and disruptive by an independent panel of judges. The full list of winners, announced by parent company WTWH Media LLC, is available at the R&D World website.

Berkeley Lab's award-winning technologies are described below.

A Tool to Accelerate Electrochemical and Solid-State Innovation

(from left) Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick (Credit: Berkeley Lab)

Berkeley Lab scientists invented a microelectrode cell to analyze and test electrochemical systems with solid electrolytes. Thanks to significant cost and performance advantages, this tool can accelerate development of critical applications such as energy storage and conversion (fuel cells, batteries, electrolyzers), carbon capture, desalination, and industrial decarbonization.

Solid electrolytes have been displacing liquid electrolytes as the focus of electrochemical innovation because of their performance, safety, and cost advantages. However, the lack of effective methods and equipment for studying solid electrolytes has hindered advancement of the technologies that employ them. This microelectrode cell meets the testing needs, and is already being used by Berkeley Lab scientists.

The development team includes Berkeley Lab researchers Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick.

Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com)

Information transmitted by MMQ-Com is impervious to security breaches. (Credit: Alexander Stibor/Berkeley Lab)

Quantum communication, cybersecurity, and quantum computing are growing global markets. But the safety of our data is in peril given the rise of quantum computers that can decode classical encryption schemes.

The Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com) technology is a fundamentally new kind of secure quantum information transmitter. It transmits messages by modulating electron matter-waves without changing the pathways of the electrons. This secure communication method is inherently impervious to any interception attempt.

A novel quantum key distribution scheme also ensures that the signal is protected from spying by other quantum devices.

The development team includes Alexander Stibor of Berkeley Lab's Molecular Foundry along with Robin Röpke and Nicole Kerker of the University of Tübingen in Germany.

Solid Lithium Battery Using Hard and Soft Solid Electrolytes

(from left) Marca Doeff, Guoying Chen, and Eongyu Yi (Credit: Berkeley Lab)

The lithium battery market is expected to grow from more than $37 billion in 2019 to more than $94 billion by 2025. However, the liquid electrolytes used in most commercial lithium-ion batteries are flammable and limit the ability to achieve higher energy densities. Safety issues continue to plague the electronics markets, as often-reported lithium battery fires and explosions result in casualties and financial losses.

In Berkeley Lab's solid lithium battery, the organic electrolytic solution is replaced by two solid electrolytes, one soft and one hard, and lithium metal is used in place of the graphite anode. In addition to eliminating battery fires, incorporation of a lithium metal anode with a capacity 10 times higher than graphite (the conventional anode material in lithium-ion batteries) provides much higher energy densities.

The technology was developed by Berkeley Lab scientists Marca Doeff, Guoying Chen, and Eongyu Yi, along with collaborators at Montana State University.

Porous Graphitic Frameworks for Sustainable High-Performance Li-Ion Batteries

High-resolution transmission electron microscopy images of the Berkeley Lab PGF cathode reveal (at left) a highly ordered honeycomb structure within the 2D plane, and (at right) layered columnar arrays stacked perpendicular to the 2D plane. (Credit: Yi Liu/Berkeley Lab)

The Porous Graphitic Frameworks (PGF) technology is a lithium-ion battery cathode that could outperform today's cathodes in sustainability and performance.

In contrast to commercial cathodes, organic PGFs pose fewer risks to the environment because they are metal-free and composed of earth-abundant, lightweight organic elements such as carbon, hydrogen, and nitrogen. The PGF production process is also more energy-efficient and eco-friendly than other cathode technologies because they are prepared in water at mild temperatures, rather than in toxic solvents at high temperatures.

PGF cathodes also display stable charge-discharge cycles with ultrahigh capacity and record-high energy density, both well beyond those of all known commercial inorganic and organic cathodes.

The development team includes Yi Liu and Xinie Li of Berkeley Lab's Molecular Foundry, as well as Hongxia Wang and Hao Chen of Stanford University.

Building Efficiency Targeting Tool for Energy Retrofits (BETTER)

The buildings sector is the largest source of primary energy consumption (40%) and ranks second after the industrial sector as a global source of direct and indirect carbon dioxide emissions from fuel combustion. According to the World Economic Forum, nearly one-half of all energy consumed by buildings could be avoided with new energy-efficient systems and equipment.

(from left) Carolyn Szum (Lead Researcher), Han Li, Chao Ding, Nan Zhou, Xu Liu (Credit: Berkeley Lab)

The Building Efficiency Targeting Tool for Energy Retrofits (BETTER) allows municipalities, building and portfolio owners and managers, and energy service providers to quickly and easily identify the most effective cost-saving and energy-efficiency measures in their buildings. With an open-source, data-driven analytical engine, BETTER uses readily available building and monthly energy data to quantify energy, cost, and greenhouse gas reduction potential, and to recommend efficiency interventions at the building and portfolio levels to capture that potential.

It is estimated that BETTER will help reduce about 165.8 megatons of carbon dioxide equivalent (MtCO2e) globally by 2030. This is equivalent to the CO2 sequestered by growing 2.7 billion tree seedlings for 10 years.
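
A quick arithmetic check of that equivalence, assuming the commonly cited factor of roughly 0.06 metric tons of CO2 sequestered per tree seedling grown for 10 years:

```python
total_tco2e = 165.8e6   # 165.8 megatons, in metric tons of CO2 equivalent
seedlings = 2.7e9       # 2.7 billion tree seedlings

tons_per_seedling = total_tco2e / seedlings
print(f"{tons_per_seedling:.3f} t CO2e per seedling over 10 years")
# prints ~0.061, consistent with the ~0.06 t/seedling equivalency factor
```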

The development team includes Berkeley Lab scientists Nan Zhou, Carolyn Szum, Han Li, Chao Ding, Xu Liu, and William Huang, along with collaborators from Johnson Controls and ICF.

Amanzi-ATS: Modeling Environmental Systems Across Scales

Simulated surface and subsurface water from Amanzi-ATS hydrological modeling of the Copper Creek sub-catchment in the East River, Colorado watershed. (Credit: Zexuan Xu/Berkeley Lab, David Moulton/Los Alamos National Laboratory)

Scientists use computer simulations to predict the impact of wildfires on water quality, or to monitor cleanup at nuclear waste remediation sites by portraying fluid flow across Earth compartments. The Amanzi-Advanced Terrestrial Simulator (ATS) enables them to replicate or couple multiple complex and integrated physical processes controlling these flowpaths, making it possible to capture the essential physics of the problem at hand.

"Specific problems require taking an individual approach to simulations," said Sergi Molins, principal investigator at Berkeley Lab, which contributed expertise in geochemical modeling to the software's development. "Physical processes controlling how mountainous watersheds respond to disturbances such as climate- and land-use change, extreme weather, and wildfire are far different than the physical processes at play when an unexpected storm suddenly impacts groundwater contaminant levels in and around a nuclear remediation site. Amanzi-ATS allows scientists to make sense of these interactions in each individual scenario."

The code is open-source and capable of being run on systems ranging from a laptop to a supercomputer. Led by Los Alamos National Laboratory, Amanzi-ATS is jointly developed by researchers from Los Alamos National Laboratory, Oak Ridge National Laboratory, Pacific Northwest National Laboratory, and Berkeley Lab researchers including Sergi Molins, Marcus Day, Carl Steefel, and Zexuan Xu.

Institute for the Design of Advanced Energy Systems (IDAES)

The U.S. Department of Energy's (DOE's) Institute for the Design of Advanced Energy Systems (IDAES) project develops next-generation computational tools for process systems engineering (PSE) of advanced energy systems, enabling their rapid design and optimization.

IDAES Project Team (Credit: Berkeley Lab)

By providing rigorous modeling capabilities, the IDAES Modeling & Optimization Platform helps energy and process companies, technology developers, academic researchers, and DOE to design, develop, scale up, and analyze new and potential PSE technologies and processes to accelerate advances and apply them to address the nation's energy needs. The IDAES platform is also a key component in the National Alliance for Water Innovation, a $100 million, five-year DOE innovation hub led by Berkeley Lab, which will examine the critical technical barriers and research needed to radically lower the cost and energy of desalination.

Led by National Energy Technology Laboratory, IDAES is a collaboration with Sandia National Laboratories, Berkeley Lab, West Virginia University, Carnegie Mellon University, and the University of Notre Dame. The development team at Berkeley Lab includes Deb Agarwal, Oluwamayowa (Mayo) Amusat, Keith Beattie, Ludovico Bianchi, Josh Boverhof, Hamdy Elgammal, Dan Gunter, Julianne Mueller, Jangho Park, Makayla Shepherd, Karen Whitenack, and Perren Yang.

# # #

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.


Three Advantages Of Using Cloud Computing In Business – CIO Applications


FREMONT, CA: Cloud computing is an alternative to traditional data center server storage that relies on internet-based hosting for data and applications. Sometimes described as serverless computing, it stores files and applications in a virtual cloud, enabling access from any device, anywhere, using machine learning principles.

Here are three advantages of using cloud computing:

Security

Keeping stored data protected is crucial, and many cloud computing providers make security a priority by providing significant data protection. Public and private cloud providers cannot afford to lose customers through inferior security, so they provide superior protection, from multi-factor authentication and security certificates to patch management. Private cloud providers offer a hands-on approach, making sure practices comply with the company's goals and objectives.

Cost Savings

Cost saving is fundamental for most companies, and cloud computing storage can offer several benefits here. Many cloud computing services are hosted by third-party service providers, and both public and private cloud solutions charge on a per-user basis. Expenses depend on how many people need access to the cloud, unlike a traditional server setup in which the price is flat whether there are one or 1,000 users.

It also reduces the number of IT FTEs needed to manage servers. Instead of hiring people to ensure that everything works properly, all maintenance issues are handled by the cloud service provider, saving cost and time.

Most cloud providers bundle other services into the per-user costs, like internet bills, help desk services, security, and support, which helps businesses reduce their on-site storage resources to cut costs without affecting what the private cloud offers.
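
A small worked example of the per-user versus flat-rate difference described above; the dollar figures below are hypothetical, not any provider's actual rates:

```python
# Hypothetical rates, for illustration only.
per_user_monthly = 25.0        # cloud: pay per user
flat_server_monthly = 2500.0   # on-prem: flat cost whether there are 1 or 1,000 users

break_even = flat_server_monthly / per_user_monthly
print(f"cloud is cheaper below ~{break_even:.0f} users")

for users in (10, 100, 1000):
    cloud, on_prem = users * per_user_monthly, flat_server_monthly
    print(f"{users:>5} users: cloud ${cloud:,.0f}/mo vs on-prem ${on_prem:,.0f}/mo")
```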

Mobility

Cloud computing resources allow files, programs, applications, and data to be accessible from anywhere, on any device with an internet connection. Additionally, it can also negate system incompatibilities.

This mobility of access allows employees to work from anywhere, optimizing productivity and flexibility. It also improves customer satisfaction, as it gives customers and clients access to information and reliable service, increasing the chances of loyalty to the brand.


How secure is the cloud in 2020? – Techerati

Despite increasing levels of adoption by organisations of all sizes, cloud solutions continue to be plagued by misconceptions about their security. It's still commonly assumed that the cloud offers a less secure option compared to on-premises infrastructure. So, how does it really shape up, and what security challenges face the cloud in 2020?

While businesses that keep their data on-site often feel as though they have more control over its security, the flaw in that plan is usually a lack of in-house expertise. The cyber-skills gap is widely documented, leaving almost half of UK businesses unable to deal with even basic security tasks. Unless you can afford a dedicated, specialised on-site security team, chances are your data would be as safe, if not safer, stored by a public cloud provider with access to the best resources and expertise.

The UK 2020 Databerg Report shows that the perception of cloud security is slowly changing. In 2015, 77% of businesses expressed concerns about cloud security, and this has seen some improvement over the last five years, although 59% remain unconvinced. According to the report, the likely reason this mistrust persists is unconnected to the physical capabilities of the cloud. Instead it lies with the fact that data stored in the cloud always remains the responsibility of the organisation, rather than the cloud provider. If a data breach occurs, the financial and reputational repercussions fall directly to the organisation.

It's therefore important that businesses seek out a vendor they can trust, with the knowledge and expertise to best secure their sensitive data. Equally important is that businesses educate their employees on best-practice protocols and procedures: human error remains one of the biggest causes of data breaches. With the rise of remote working and BYOD, more and more data is accessed via the cloud, making it harder for organisations to keep an eye on data security. In 2020, 31% of employees took business information outside of the organisation via cloud storage, up from 21% in 2015.

While the cloud does not inherently pose additional risk, it remains a target for cybercriminals. Back at the beginning of 2020, as hackers scrambled to take advantage of the pandemic, cloud-based attacks rose by 630% between January and April. In Verizon's 2020 Data Breach Investigations Report, however, we can see that cloud security still compares favourably to on-premises alternatives. This year, cloud assets accounted for 24% of breaches, compared to 70% for on-prem assets.

Of those cloud breaches recorded, 77% involved compromised credentials. Rather than demonstrating an inherent weakness in the cloud's security, this serves to illustrate the huge growth in social engineering attacks, such as phishing scams, that aim to steal privileged access credentials. The quickest and easiest way for cybercriminals to access systems (cloud-based or otherwise), credential theft is fast becoming one of the worst offenders among causes of data breaches. According to the latest Ponemon Institute Cost of a Data Breach Report, a fifth of all data breaches are now the result of stolen or compromised credentials. Worryingly, this was found to increase the average cost of a breach by almost $1 million.

The Ponemon report also states that misconfigured cloud servers tie with compromised credentials as the most frequent threat vector. This is confirmed by Verizon's findings, which show that misconfiguration errors have increased since 2017, to the point where they are now more common than malware and outranked only by hacking.

With these statistics in mind, it's easy to understand why those 59% of organisations remain wary of the cloud. The result of human error during setup, cloud misconfiguration can leave data exposed or present vulnerabilities that could later be exploited by threat actors. It's important to make sure your cloud is configured by experts, regularly audited, updated and patched. Responsibility for configuration of the cloud is shared between an organisation and their service provider, so it's important to make sure you're working with the right partner.
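
One concrete example of the kind of audit that catches such misconfigurations is below: a minimal Python sketch, assuming an AWS estate and the boto3 SDK, that flags S3 buckets whose ACLs grant access to everyone. A real audit would cover far more, such as bucket policies, Block Public Access settings, IAM roles and logging.

```python
import boto3

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    for grant in acl["Grants"]:
        # A grant to the AllUsers group makes the bucket world-accessible.
        uri = grant["Grantee"].get("URI", "")
        if uri.endswith("/AllUsers"):
            print(f"WARNING: bucket {name} grants access to everyone")
```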

UKFast offers a truly unique, tailored approach to cloud hosting, with a range of public and private cloud servers and a team of security specialists on hand to keep your data safe. Speak to a cloud expert today on 0800 073 0317.


Cloud computing is betting on outer space – Mint

The Redmond-headquartered company, however, has competition in the skies. Almost five months earlier, International Business Machines Corp. (IBM) had announced a beta of its Cloud Satellite service. But it is Amazon Web Services Inc. (AWS), the cloud computing arm of Amazon.com, which has a head start in space.

Around two years ago, it launched the AWS Ground Station to allow its customers to control their satellite communications, process data, and scale operations without having to build or manage their own ground station infrastructure. On 30 June, AWS said it was establishing a new space unit called Aerospace and Satellite Solutions.

These are but a few cases in point to demonstrate that leading cloud computing service providers have begun flexing their muscles in space too. But why is there a sudden race to outer space?

According to the International Telecommunication Union (ITU), non-geostationary satellite orbits (NGSOs) such as medium earth orbits (MEO) and low earth orbits (LEO) are being increasingly used worldwide. NGSOs, unlike fixed or geostationary satellite orbits, move across the sky during their orbit around the earth. With space launches becoming more affordable and accessible, a slew of private companies are starting to rely on this new array of satellites.

They are used for applications like weather forecasting, surface imaging, communications, and video broadcasts. However, the data from these satellites needs to be processed and analysed in data centres on the ground, which explains the term "ground stations".

While the cost of the satellite itself is falling, building and running ground stations can cost up to $1 million or more, according to a recent blog post by Jeff Barr, chief evangelist for AWS. Complex data processing also requires a lot of computing power, and the huge data storage requirements only add to the cost.

Leading cloud computing service providers are now starting to offer satellite operators the option to use these ground stations on a pay-per-use or subscription basis, thus, helping the latter save on capital expenditure costs by employing an operating expenditure model.
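
A rough sketch of that opex-versus-capex trade-off; the per-contact fee and usage volume are assumptions for illustration, not AWS Ground Station pricing:

```python
# All figures are illustrative assumptions, not published prices.
build_cost = 1_000_000      # "up to $1 million or more" to build a ground station
fee_per_contact = 100       # hypothetical pay-per-use fee per satellite contact
contacts_per_year = 2_000   # hypothetical downlink contacts per year

yearly_opex = fee_per_contact * contacts_per_year
years_to_match = build_cost / yearly_opex
print(f"${yearly_opex:,}/yr pay-per-use; ~{years_to_match:.0f} years of use "
      "before matching the up-front build cost")
```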


These ground stations, thus, can help satellite operators download high-resolution imagery faster and more regularly, and analyse the data with artificial intelligence (AI) tools, all of which results in faster and enhanced monitoring of changing climate patterns, forests and agriculture, among other things.

While Microsoft and IBM are testing their services, AWS Ground Station already has customers such as NASA's Jet Propulsion Laboratory and satellite operators Iridium Communications and Spire Global. It also has private sector customers such as Lockheed Martin, Maxar Technologies and Capella Space.

Lucrative market

The worldwide cloud infrastructure services market continued to surge in the April-June quarter of this calendar year to touch $34.6 billion, according to research firm Canalys. The growth was attributed to the consumption of cloud-based services for online collaboration and remote working tools, e-commerce, remote learning, and content streaming which hit new records during the lockdown.

During this period, AWS was the leading cloud service provider, accounting for 31% share of the total spend. Microsoft Azure came second, followed by Google Cloud and Alibaba Cloud.

The revenue of the cloud unit of Amazon totalled $10.81 billion in the April-June quarter of this calendar year, accounting for 12% of its parent's revenue.

Microsoft, on the other hand, said its commercial cloud "surpassed $50 billion in annual revenue for the first time" for the quarter ended June 30 (which is also its financial year ending). But it does not spell out what this commercial cloud consists of.

Nevertheless, the space forays will only add to the revenue of all these companies.

Battle lines in India

Space deals will add spice in India too. India's cloud computing market was estimated at $2.5 billion in 2018, dominated by infrastructure as a service (IaaS) and software as a service (SaaS), according to industry body Nasscom. It is forecast to touch over $7 billion in 2022.

AWS, Microsoft and Google are leaders on the local turf too. Last August, for instance, Microsoft signed a deal with Reliance Jio Infocomm Limited (Jio), a subsidiary of Mukesh Ambani-owned Reliance Industries Ltd (RIL). The agreement included deploying the Microsoft Azure cloud platform in Jio's data centers in locations across India.

This January, Google Cloud signed a deal with Bharti Airtel to cater to small and medium enterprises (SMEs) in India. However, Google said this July that it was pumping $4.5 billion into Airtel's rival Jio Platforms in exchange for a 7.7% stake. Not surprisingly, a month later, Bharti Airtel announced a multi-year agreement with AWS to deliver cloud solutions to big companies and SMEs in India.

According to Alok Shende, Managing Director of Ascentius Insights, the fusion of cloud computing with networking, linked by a satellite, is expected to shave off milliseconds in transferring data from source to destination. "This is the holy grail in many applications, more specifically in finance and in mission-critical applications. There are many India-centric applications (like defence and in the stock markets) where this could play a powerful role."

He believes that for Microsoft, particularly, this move "opens a new avenue to entrench itself in the enterprise market where it has traditionally been a strong player on the application side but has lost the leadership position in terms of market share for cloud."

Jayanth Kolla, founder and partner of Convergence Catalyst, points out that India has always been a strong player in the space sector, with the Indian Space Research Organisation (Isro) developing and launching satellites at a fraction of global costs. He believes that the Indian government's decision to open up India's space sector to private players is an encouraging sign.

"It has already resulted in Indian space tech startups such as Pixxel, Bellatrix Aerospace, Vesta Space and Agnikul raising over $20 million in funding from venture capitalists (VCs) in the last six months. TV media, agriculture, telemedicine and logistics are a few sectors that can benefit from strong satellite communication and space technology development. The launch of ground station services by Microsoft and AWS will only expedite this ecosystem development significantly in India," says Kolla.

Sanchit Vir Gogia, chief analyst and founder of Greyhound Research, concurs that the timing of this space move is right, since many organizations are now beginning to try new use-cases by tapping into geospatial data (data related to a specific location on earth) that is omnipresent, given the proliferation of devices and edge computing devices.

"This space is increasingly getting busy, with the likes of AWS and IBM investing money and resources to cater to this opportunity," notes Gogia. He cautions, however: "We believe the trick in making such an offering successful is to ensure that it is cheap to start with, since most of these projects are nothing more than trials and, hence, have an extremely high failure rate."

The distributed cloud

Space is just an additional frontier for the leading cloud services providers. It all began when companies, which traditionally used servers for their computing needs, realised that they could lower costs by accessing IT resources over the internet and paying only for the services they needed, reducing capex, a trend we now know as cloud computing.

Many companies today use private clouds (on-premise), public clouds (on a network, typically the internet) and hybrid clouds (combining public and private). User companies, though, became wise and began adopting a multi-cloud vendor approach to avoid being locked in by any single technology or cloud vendor.

With billions of devices getting connected to each other as part of the Internet of Things (IoT) trend, computing is now also getting done at the so-called "edge", which simply means near the source of the data.

General Electric Co. (GE), for instance, believes cloud computing is best suited to situations that demand actions such as significant computing power, management of huge data volumes from across plants, asset health monitoring and machine learning. Edge computing, on the other hand, makes sense in places like mines or offshore oil platforms that have bandwidth constraints, which make it impractical or very expensive to transmit data from machines to the cloud.

During his speech at the Ignite event, for instance, Nadella pointed out that Microsoft was extending Azure "from under the sea to outer space". He was referring to Project Natick, which aims to serve customers in areas near large bodies of water. Natick uses AI to monitor signs of failure in its servers and other equipment.

Going forward, Microsoft says it will explore powering a Natick data center with "a co-located ocean-based green power system, such as offshore wind or tide, with no grid connection".

Similarly, other than deploying internet balloons in space to provide broadband services, Google also provides services to companies like Planet Labs Inc. The US-based aerospace and data analytics company uses Google Cloud platform to process all of its satellite images and Google Cloud storage to host its image archive.

These moves have given rise to a trend called distributed cloud, which research firm Gartner describes as the "distribution of public cloud services to different physical locations".

"By 2023," posits a 22 January note by Gartner, "the leading cloud service providers will have a distributed ATM-like presence to serve a subset of their services for low-latency application requirements... Micro data centers will be located in areas where a high population of users congregates, while pop-up cloud service points will support temporary requirements like sporting events and concerts."

Greyhound Research believes offerings such as ground stations will be highly valuable in the next wave of investments in more distributed computing environments. "More than 7 in 10 of our end-user inquiries with global majors have confirmed that organizations, in the next 3-5 years, will use a large variety of computing environments and make them more contextual to the use-case," says Gogia. "This change is likely to be paced multiple times, given the investments in edge networks and 5G that allow remote sites in utilities, oil and gas, manufacturing, and many other scenarios," he adds.

The distributed cloud market is forecast to reach $3.9 billion by 2025, growing at a CAGR of 24.1% over the forecast period from 2020 to 2025, according to market research firm IndustryARC. Security, though, remains a concern if proper protocols and policies are not adhered to in a distributed cloud.
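
Working that CAGR figure backwards gives the forecast's implied 2020 base, a quick sanity check on the numbers:

```python
target_2025 = 3.9e9   # forecast market size in 2025
cagr = 0.241          # 24.1% a year over 2020-2025
years = 5

base_2020 = target_2025 / (1 + cagr) ** years
print(f"implied 2020 market size: ${base_2020 / 1e9:.2f}B")  # ~ $1.32B
```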

For now, though, ground stations that cater to satellite companies will remain one big component of the distributed cloud. A race is clearly on, and all the main players are looking up at the sky.

Leslie D'Monte is a consultant who writes on the intersection of science and technology.


VMware wants to play nice with Nvidia DPUs – Blocks and Files

VMware and Nvidia announced yesterday they are working to make VMware software work better with Nvidia chips. They say the joint initiative, dubbed Project Monterey, will introduce a new security model that offloads hypervisor, networking, security and storage tasks from the CPU to the DPU.

The aim is to offload hypervisor, networking, security and storage tasks from a host CPU to Nvidia's BlueField data processing unit (DPU). This should be useful for AI, machine learning, and high-throughput, data-centric applications, according to the companies.

Nvidia CEO Jensen Huang said in the launch announcement: "Nvidia DPUs will give companies the ability to build secure, programmable, software-defined data centres that can accelerate all enterprise applications at exceptional value."

Paul Perez, SVP and CTO, Infrastructure Solutions Group at Dell Technologies, also provided a statement: "We believe the enterprise of the future will comprise a disaggregated and composable environment."

Dell said VMware Cloud Foundation will be able to maintain compute virtualization on the server CPU while offloading networking and storage I/O functions to the SmartNIC CPU. VMware has taken the first step to achieve this by enabling VMware ESXi to run on SmartNICs.

A SmartNIC or DPU is a programmable co-processor that runs non-application tasks from a server CPU, so enabling the server to run more applications faster. DPUs can compose disaggregated data centre server compute, networking and storage resources. They can also function as intelligent network interface cards that provide security services and network acceleration.

Nvidia's BlueField-2 is a Mellanox system-on-chip (SoC) that integrates a ConnectX-6 Dx ASIC network adapter with a PCIe Gen 4 x16 lane switch, 2 x 25/50/100 GbitE or 1 x 200 GbitE ports, and an array of 8-core, 64-bit Arm processors. This provides an integrated crypto engine for IPsec and TLS cryptography, integrated RDMA and NVMe-oF acceleration, and dedupe and compression.

Three use cases are envisaged. First, BlueField-2 can be used with disaggregated storage, which it virtualizes, enabling remote, networked storage to be part of a composable infrastructure. Second, BlueField-2 can provision bare metal servers as a CSP operator service to cloud tenants.

VMware said it will re-architect VMware Cloud Foundation to enable disaggregation of the server including support for bare metal servers, a new Cloud Foundation facility. It will enable an application running on one physical server to consume hardware accelerator resources such as FPGAs from other physical servers.

With ESXi running on the SmartNIC, customers will be able to use a single management framework to manage all their virtualized and bare metal compute infrastructure.

Third, BlueField-2 can be used for micro-segmentation at endpoints to isolate application workloads and their resources from each other.

There is a security aspect to Project Monterey. Each SmartNIC is capable of running a fully featured stateful firewall and advanced security suite. Up to thousands of tiny firewalls can be deployed and automatically tuned to protect the specific application services that make up each application.

Project Monterey is available as preview code.

VMware is collaborating with Intel, Nvidia and Pensando, and system vendors Dell, HPE and Lenovo to deliver Project Monterey systems. Dell said it could deliver automated systems using SmartNICS from a broad set of vendors.

DPU suppliers include three startups: Fungible, Nebulon, and Pensando. Pensando recently announced it will provide its DPU as a factory-supported option on HPE servers across the VMware Cloud Foundation product line, including vSphere, vSAN, and NSX. Customers will be able to access Pensando's platform directly within VMware software.

Separately, VMware announced at VMworld 2020 yesterday that it is jointly building a deployment platform for VMware-controlled servers to run AI software on attached Nvidia A100 GPUs. The platform combines VMware's vSphere, Cloud Foundation and Tanzu container orchestration software with Nvidia's NGC software.

NGC (Nvidia GPU Cloud) is a website catalogue of GPU-optimised software for deep learning, machine learning, and high-performance computing. NGC software is supported on a select set of pre-tested Nvidia A100-powered servers expected from leading system manufacturers.


Industry Groups Spar Over NDAA Provisions on Sourcing of Electronics from China – Nextgov

Manufacturers and assemblers of printed circuit boards are standing apart from other major industry groups in praising sections of the National Defense Authorization Act that would require defense contractors to use less and less of such equipment from adversarial nations over time.

"For years, domestic industry has diminished in size and power while other countries, including China, have invested heavily in bolstering their own industrial capabilities," reads a Sept. 29 letter IPC, a trade association of the manufacturers, sent to the chairs and ranking members of the House and Senate Armed Services committees. "As a result, DoD today relies on nonallied producers for [printed circuit boards and printed circuit board assemblies] in areas including cloud servers, IT, and telecom networks. This continued reliance on untrusted foreign suppliers for [printed circuit boards and printed circuit board assemblies] poses numerous risks to national security."

Both the House- and Senate-passed versions of the NDAA would require Defense contractors to use increasingly more of such equipment from U.S. manufacturers or those of allied countries. Under the House bill, 100% of printed circuit boards and printed circuit board assemblies would come from those covered countries by 2033. The Senate bill calls for full sourcing from covered countries by 2032, with 25% of the equipment coming from trusted countries by 2023.

The Senate bill also explicitly bars procurement of the equipment from China, Russia, Iran and North Korea, all of which have been designated as posing world-wide threats by the intelligence community.

"We urge you to speedily resolve any remaining issues between the House and Senate while keeping in mind the strong protections that passed both chambers with overwhelming support and without any vocal opposition," IPC wrote.

A conference committee, which will reconcile the House and Senate bills before a final vote on the legislation, has not yet been formed, staff from the office of Rep. Jim Langevin, D-R.I., chairman of an Armed Services subcommittee on emerging threats, told Nextgov.

The NDAA provisions, Sec. 808 in the Senate bill and Sec. 826 in the House bill, did spur opposition from a broader group of companies in advance of those negotiations.

"ARWG remains concerned with the broad applicability and programmatic impact of the House and Senate provisions related to printed circuit board (PCB) procurement," reads a Sept. 24 letter the Acquisition Reform Working Group also sent to the leaders of the Armed Services committees in both chambers.

ARWG includes the Associated General Contractors of America, the Information Technology Industry Council, the Computing Technology Industry Association, the National Defense Industrial Association, the American Council of Engineering Companies and the United States Chamber of Commerce.

"ARWG recommends the conferees direct the Secretary of Defense to implement a design verification standard to ensure that [printed circuit boards] present no national security risk regarding counterfeiting, quality, or unauthorized access," the associations collectively wrote. "Subsequent to this submission, ARWG will provide specific recommendations on these matters separately."

The House and Senate bills both contain a number of ways companies might receive waivers from the provisions from the Secretary of Defense, including if the secretary determines the covered equipment "poses no significant national security concerns regarding counterfeiting, quality, or unauthorized access."

Chris Mitchell, IPC's vice president for global government affairs, told Nextgov the request for a design verification standard and the provisions in the NDAA bills are not mutually exclusive, noting "we should and have been working to develop standards along the lines of what's laid out in the [ARWG] letter."

He added that IPC separately has a trusted supplier standard, IPC 1791, that the group believes is a model for DOD. And he stressed that while the design verification standard has merit, it doesn't fully address either the security issue or the resilience of the industrial base in general, with 55% of printed circuit board production happening in China.

A Sept. 16 blog post from IPC President John Mitchell asserts, "The opposition fears the new requirements will disrupt their established supply chains in countries that are not affirmatively covered."

"The status quo may be advantageous to some, but this is not a compelling enough reason to nullify a major step forward for American manufacturing capabilities," the IPC letter reads. "[The NDAA provisions] would create new high-skilled workforce opportunities for U.S. workers and provide trusted supplies to the U.S. government to use in critical applications."
