
Edge Computing Impact: What Does It Do? – Dataconomy

The advent of edge computing is poised to have a significant influence on various industries, exerting its effect on both current and forthcoming verticals. Although certain sectors have already experienced the initial waves of this impact, others are anticipated to adopt it at a relatively slower pace. Consequently, telecommunications companies must exercise caution and prudence in their vertical selection process, ensuring that they choose the most appropriate target area in light of the edge computing impact.

Edge computing refers to a distributed computing paradigm that brings data processing and computation closer to the edge of the network, in close proximity to the data source or end-users. Unlike traditional centralized computing models, where data is sent to a remote data center or the cloud for processing, edge computing enables data processing and analysis to occur at or near the point of data generation.

In edge computing, small-scale data centers, known as edge nodes or edge devices, are deployed at the network edge. These nodes can include routers, gateways, servers, or IoT devices. By processing data locally at the edge, edge computing reduces latency, improves real-time responsiveness, and enhances overall system performance.

The key idea behind edge computing is to bring computation closer to the data source, which offers several advantages. It enables faster data analysis and decision-making, reduces reliance on the cloud for processing and storage, and minimizes the amount of data that needs to be transmitted over the network. This approach is particularly beneficial in scenarios where real-time processing, low latency, bandwidth efficiency, and data privacy are critical requirements, such as IoT applications, autonomous vehicles, industrial automation, and smart cities.
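The bandwidth and latency argument above can be made concrete with a minimal Python sketch. This is illustrative only: the function name, the reading values, and the alert threshold are invented for this example. The idea is that an edge node summarizes raw sensor readings locally and forwards only a compact payload upstream.

```python
from statistics import mean

def process_at_edge(readings, alert_threshold):
    """Summarize raw sensor readings locally and flag anomalies,
    so only a small payload crosses the network."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "alerts": [r for r in readings if r > alert_threshold],
    }

# 1,000 raw temperature readings stay on the edge node...
readings = [20.0] * 998 + [95.5, 97.1]
payload = process_at_edge(readings, alert_threshold=90.0)
# ...and only the compact summary is sent upstream.
print(payload["count"], payload["mean"], payload["alerts"])
```

Here a thousand raw readings reduce to a three-field summary, which is the bandwidth-efficiency point: the network carries the result of the analysis, not the data that produced it.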

Edge computing has the potential to revolutionize various industries by enabling new use cases and applications that require low-latency data processing, real-time analytics, and localized decision-making. It complements cloud computing by providing a decentralized and distributed computing infrastructure that brings computational power and intelligence closer to where it is needed, unlocking the full potential of emerging technologies and enabling innovative solutions.

The main characteristics of edge computing encompass:

There exist four distinct categories of Edge Computing, each serving different purposes within the overall framework.

By categorizing edge computing into these four types, we can better understand the diverse implementations and their respective roles within the broader landscape.


Fog computing serves as an extension of cloud networks, which consist of interconnected servers forming a distributed network infrastructure. These networks empower organizations to surpass the resource limitations they would otherwise encounter. A primary advantage of cloud networks lies in their ability to gather data from diverse sources, making it accessible from anywhere via the internet. While fog computing shares similarities with cloud networks, both involving intelligent data processing at the time of creation, a crucial distinction exists between the two in terms of intelligence and computing power.

Fog computing places greater emphasis on intelligence within the local area network (LAN) environment. In this architecture, data originating from endpoints is transmitted to a gateway, which subsequently routes it to appropriate sources for processing. The processed data is then returned to the transmission path. Conversely, edge computing prioritizes computing power and data processing at the edge of the network itself, performing computational tasks on embedded computing platforms that interface directly with sensors and controllers.

By deploying fog computing, organizations can leverage local intelligence within their LAN, facilitating efficient and localized processing of data. This approach is particularly beneficial in scenarios where real-time responsiveness, low latency, and optimized resource utilization are critical factors. On the other hand, cloud networks excel in providing substantial computing power and enabling data processing on a broader scale, with a focus on centralized cloud-based infrastructure.

Key takeaways:

The emergence of edge computing is poised to have a substantial and far-reaching influence on various industries. While certain verticals have already experienced the initial effects of this transformative technology, others are expected to be slower in adopting edge computing solutions. Understanding this disparity in adoption readiness is crucial for operators seeking to capitalize on the edge computing opportunity and expand revenue streams beyond core connectivity offerings.

The advancements in edge computing, characterized by reduced latency, improved reliability, enhanced security, and increased mobility, unlock a plethora of new use cases across various industries. One prominent example lies in the realm of security solutions, where the deployment of edge computing infrastructure enables video ingest and analytics at the network edge, leading to significant impacts.

As the prevalence of video surveillance continues to rise, there is a corresponding surge in data volumes generated by the growing number of cameras and the improved quality of video recordings. Edge computing effectively addresses the challenge posed by the escalating data volumes by decentralizing traffic and analysis, enabling on-site processing in real-time for monitoring purposes or triggering alarms.

The latency requirements for real-time processing make it impractical to rely solely on cloud-based solutions. By leveraging edge computing, the necessary functionalities can be performed locally at the network edge, ensuring minimal latency and immediate response. Additionally, conducting these operations at the edge enhances data security, as sensitive information is processed and stored closer to its source, reducing the risk associated with transmitting data to a centralized cloud infrastructure.
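The kind of on-site video processing described above can be sketched in Python. This is a toy stand-in for real video analytics, with frames simplified to flat lists of pixel intensities and an invented threshold: the edge node compares consecutive frames and emits only the indices of frames that trip the alarm, so raw video never has to leave the site.

```python
def motion_score(prev_frame, frame):
    """Mean absolute pixel difference between two consecutive frames."""
    diffs = [abs(a - b) for a, b in zip(prev_frame, frame)]
    return sum(diffs) / len(diffs)

def detect_events(frames, threshold):
    """Run detection on-site; emit only the frame indices that trip the alarm."""
    events = []
    for i in range(1, len(frames)):
        if motion_score(frames[i - 1], frames[i]) > threshold:
            events.append(i)
    return events

static = [10, 10, 10, 10]    # a quiet scene
moving = [10, 200, 200, 10]  # something enters the frame
frames = [static, static, moving, static]
print(detect_events(frames, threshold=50))
```

Only the event indices (frames 2 and 3, where the scene changes sharply in either direction) would need to be transmitted, which is the decentralized-analysis point made above.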

Thus, edge computing plays a vital role in meeting the demands of processing and analyzing video data, offering improved security, lower latency, and real-time insights for effective monitoring and alarm triggering in the face of growing data volumes and evolving surveillance requirements.

Telcos can assess potential target verticals for edge computing solutions based on various metrics to determine the most attractive opportunities. Here are several key metrics that telcos could consider.

Indeed, the contribution of an industry to the GDP of a country (or countries) where a telco operates can serve as a useful indicator of its ability and willingness to invest in digital solutions. When telcos consider offering edge computing solutions, the financial aspect plays a crucial role, as significant investments are required. Evaluating the target vertical's spending capacity becomes essential for telcos aiming to maximize return on investment (ROI).

Industries that make a substantial contribution to a country's GDP often possess greater financial resources and a stronger appetite for digital transformation. These industries are more likely to prioritize and allocate budget towards innovative solutions such as edge computing. By targeting verticals with a higher GDP contribution, telcos increase their chances of engaging with industries that are financially capable and inclined to invest in digital advancements.

Using GDP as a proxy for spending capacity provides a useful framework for telcos to assess the potential ROI of their edge computing offerings. It helps identify verticals where the likelihood of securing investment and achieving revenue growth is higher, aligning with the telco's strategic objectives and financial sustainability.


Telcos venturing into offering verticalized edge solutions must possess a solid comprehension of the pain points faced by enterprise customers in the present context. Additionally, establishing strong anchor customers with whom they can collaborate to test and develop new solutions becomes crucial. Leveraging existing industry expertise and relationships serves as a valuable starting point for telcos in this endeavor.

For instance, TELUS, with its strong vertical focus on healthcare through TELUS Health, can leverage its industry knowledge and relationships to explore an edge-enabled approach within the healthcare sector. By understanding the specific challenges faced by healthcare organizations, TELUS can tailor edge computing solutions to address their pain points and provide enhanced services and experiences.

Similarly, Verizon, known for its strong vertical presence in the transportation industry through Verizon Connect, can leverage its expertise to pursue edge-enabled opportunities within the transport sector. Building upon their existing industry relationships and understanding the unique requirements of the transportation industry, Verizon can develop and deploy edge computing solutions that cater to the specific needs of transportation companies.

Industries that have reached a higher level of digital maturity are more likely to adopt edge computing-enabled solutions earlier. This is because certain prerequisites, such as having operational data stored in a database rather than manually recorded, are necessary for edge solutions to deliver value. Several indicators can be used to measure digital maturity, including digital spending, the level of digitization in business processes, and the extent of work digitization.

By focusing on industries that are already digitally mature, telcos can strategically target their efforts towards sectors that are better prepared for the adoption of edge computing. Leveraging the existing digital infrastructure and capabilities of these industries, telcos can position themselves as partners in their digital transformation journey, offering tailored edge computing solutions to further enhance operational efficiency, data analytics, and real-time decision-making.
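One possible way to combine the assessment metrics discussed above (GDP contribution and digital maturity) into a ranking is a simple weighted score. The weights and the per-vertical figures below are purely illustrative assumptions, not values from the text:

```python
def score_verticals(verticals, w_gdp=0.6, w_maturity=0.4):
    """Rank candidate verticals by a weighted blend of GDP contribution
    and digital-maturity indicators (both normalized to a 0-1 scale)."""
    ranked = sorted(
        verticals.items(),
        key=lambda kv: w_gdp * kv[1]["gdp"] + w_maturity * kv[1]["maturity"],
        reverse=True,
    )
    return [name for name, _ in ranked]

# Illustrative, made-up figures for three candidate verticals.
candidates = {
    "healthcare":     {"gdp": 0.18, "maturity": 0.7},
    "transportation": {"gdp": 0.08, "maturity": 0.8},
    "agriculture":    {"gdp": 0.05, "maturity": 0.3},
}
print(score_verticals(candidates))
```

A real assessment would of course use many more factors (pain points, anchor-customer relationships, regulatory fit), but a transparent scoring model of this shape makes the trade-offs between metrics explicit and auditable.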


When telcos aim to offer verticalized edge solutions, it is crucial for them to assess enterprise verticals based on various factors in order to determine the optimal target. By evaluating these factors, telcos can make informed decisions and identify the right verticals to focus their efforts on. Here are some key factors to consider:

Key takeaways:

Embracing the boundless possibilities of edge computing, telcos embark on a remarkable journey to weave a tapestry of technological marvels across industries. With an astute understanding of enterprise pain points and a collaborative spirit, telcos forge strong alliances with anchor customers, together venturing into uncharted territories of innovation. Guided by their industry expertise, telcos sculpt tailored edge solutions that seamlessly address challenges, empower businesses, and create transformative experiences.

Like a virtuoso conductor, telcos orchestrate the symphony of edge computing impact, harmonizing connectivity, speed, and intelligence. Through this technological symposium, industries witness a metamorphosis, transcending limitations and unleashing unprecedented possibilities. With each interaction, edge computing's influence radiates, amplifying the pulse of real-time analytics, unlocking the gates to low latency, and defying the constraints of conventional computing.


Size of the Prize: Assessing the Market for Edge Computing in Space – Via Satellite


This is the second of a two-part series analyzing the value of edge computing in space by the Boston Consulting Group. Read Part One: Size of the Prize: How Will Edge Computing in Space Drive Value Creation?

What key drivers are necessary to ensure that edge computing in space is widely adopted to the degree that it reaches an inflection point of affordability? We at the Boston Consulting Group believe that cybersecurity, cost, and ESG will drive the market for edge computing in space.

Cybersecurity is an area in which edge computing offers distinct advantages. Cloud computing is vulnerable to the ever-increasing risk of cybersecurity breaches, which can lead to major data theft or loss. Organizations across industries that collect personally identifiable information on a public cloud expose themselves to liability and/or compliance concerns, while sensitive intellectual property and proprietary industry data can become vulnerable to cybersecurity attacks at various nodes of transmission, particularly given growing dependency on cloud computing.

The main challenge presented by the current cloud computing landscape is that corporate services and data are entrusted to third parties and are exposed to a higher level of risk, both in terms of security and privacy. The top three threats to cloud systems are unsafe API interfaces, data loss or theft, and hardware failure. The widespread use of virtualization in the implementation of cloud infrastructure also creates security problems because it alters the relationship between operating systems and underlying hardware, introducing an additional level that must be managed and protected.

In contrast, edge computing introduces multiple advantages for cybersecurity since data is processed locally. This eliminates risks stemming from data transfers, which are typically encrypted and inevitable when using typical terrestrial cloud solutions. With edge computing, complex calculations occur at the IoT device/perimeter server level and the only transfer is that of the final result to the user. The risk of data loss is driven more by damage to local servers, rather than cybersecurity vulnerabilities.


Cost also presents an area of advantage to edge computing. Organizations could achieve operational cost savings by using edge computing due to the minimal need to move data to the cloud. Since data is processed at the same location where it is generated (in this case, on the satellites themselves, collecting imagery through hyperspectral or SAR capability or remote sensing data), processing these batches of data on the same satellite would also yield a significant reduction in the bandwidth needed to handle the data load.

Hosting applications and data on centralized hosting platforms or centers creates latency when users try to use them over the internet. Large physical distances coupled with network congestion or outages can delay data movement across the network. This then delays any analytics and decision-making processes.

Edge computing in space, in this context, could enable data to be accessed and processed with little or no obstacles, even when there is poor internet connectivity. Importantly, if there is failure with one edge device, it will not destroy the operation of the other edge devices in the ecosystem, facilitating a reliable, connected system.

Finally, there are potential gains to be achieved in terms of ESG metrics by adopting in-space edge computing capability. With the cloud business model dominating, there are emerging concerns about the environmental effects of centralized processing. Processing centers require enormous resources to function; they contribute to carbon emissions, accounting for 0.6% of all greenhouse gas emissions, and produce electronic waste, adding to the burden humans put on the environment in pursuit of advancement.

Edge computing has become a potential alternative to moving data centers to greener practices. The edge helps reduce the networking traffic coming in and out of centralized servers, reducing bandwidth and energy drains. This frees up bandwidth at the data center itself and bandwidth for the organization, overall, in terms of any centralized servers on-premises. Moving edge computing to space would achieve even further reductions in energy consumption required at the terrestrial data center level, while the needs for temperature control and cooling would be eliminated by the freezing temperatures in LEO.

In order to estimate an overall market for edge computing in space and explain why in-space edge computing capability and associated user interface applications need to be built, we triangulated three approaches to the market: Supply, Demand, and Cost.

Today, roughly 20% of data processing and analysis occurs locally, with 80% happening in centralized data centers and computing facilities.

We developed a high, low, and base case for estimating the share of industry addressable by space solutions, and as a core assumption of the model, we used reliance on cybersecurity to gauge what share of industry would be addressable by space. With this model, we expect an estimated $250 million market by 2030, with defense and satcom as leading industries for application. However, it is important to note that the estimated $250 million market is addressed by only one segment of the total scope available as one looks at the Edge Computing in Space Capability Stack (Figure 2).

Figure 2: The capability stack for edge computing in space demonstrates the breadth of functions which could be enabled and supported for different end users. Source: BCG analysis.

Further upside would emerge as addressable market opportunity for connectivity service providers (satcom/telecom); applications developers (who would be responsible for developing the apps for the specific government customer to interpret processed information, for example); terminals/user-interface manufacturers; and the residual flow down to data centers for cloud computing purposes. Other segments of the Edge Computing in Space Capability Stack would see further value unlocked as Edge in Space comes online, delivers key capabilities to the highest-need customer groups (e.g., those in defense), and brings the cost curve down for commercial use cases and applications to emerge.

By estimating demand for cloud computing across target industries, supply for satellite revenue in the aggregate space market, and comparing the cost of terrestrial and space data storage centers, we believe that there is more demand for cloud computing in space than the supply of satcom providers.

Our model indicates that the cost to host data in space will closely approach terrestrial data costs past 2030, while on supply and demand, we anticipate more demand for cloud computing in space than supply from satcom providers.

In light of these differentiating factors and our model research, demand for edge computing is established and expected to grow (Figure 3). We project all of Satellite IoT spending, $1.5 billion by 2030, to be addressable given the importance of cybersecurity. We estimate the relevant edge computing market (excluding hardware and non-core service software) to be $0.3 billion by 2030, of which 75% would be in-scope. Finally, we estimate up to 2% of the total $1.2 billion cloud compute market by 2030 to be in-scope due to the selective applicability of cybersecurity and latency needs for real-time analysis.
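Taken at face value, the in-scope slices quoted above can be combined arithmetically. Whether BCG intends these three segments to be simply summed is an assumption of this sketch; the figures themselves are the ones stated in the text:

```python
# Figures stated in the text, in $B by 2030.
sat_iot = 1.5          # Satellite IoT spending, all addressable (cybersecurity)
edge_market = 0.3      # relevant edge computing market (excl. hardware, non-core software)
edge_in_scope = 0.75   # share of that market in-scope
cloud_market = 1.2     # total cloud compute market
cloud_in_scope = 0.02  # up to 2% in-scope (selective cybersecurity/latency needs)

addressable = sat_iot + edge_market * edge_in_scope + cloud_market * cloud_in_scope
print(f"${addressable:.3f}B addressable by 2030")
```

Summed this way, the three slices come to roughly $1.75 billion, dominated by the Satellite IoT component.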

Figure 3: Demand Model Methodology and Driver Tree

However, research indicates that supply is currently lagging behind expected need due to insufficient public and private investment, with key implications for government and private investors.

The key drivers to understanding which companies will unlock the potential of edge computing in space include prioritizing cybersecurity, lowering cost burden, and adopting ESG practices. With increasing digitalization, the space economy will further benefit from integrating edge computing into space-based business models. However, companies and governments must help develop the needed supply that our current space investment demands.

While cloud computing will remain an integral part of the overall market for the foreseeable future, the advantages offered by edge computing in space are clear enough that actors in the most promising markets of defense and agriculture should be considering the questions posed earlier. For government, how can they leverage this technology to enhance the security of critical assets and information? How should government invest in developing the market for space-based edge computing, and how can they effectively support its growth? What role will incentives play, and will they be tied to ESG targets?

For industry, there are questions around how to sell to target customers in key markets such as government and agriculture. Are the start-up and non-recurring engineering costs prohibitive and what investments and partnerships will be required? What scenarios exist for the development of requisite ground infrastructure?

Go-to-market success will require integrating the edge computing in space-as-a-service capability into a suite of other services that could already be on offer. In addition, as commercial space stations look to develop edge computing in space offerings, successful methods will integrate this capability among others in orbit, such as where and how remote sensors collect the data, where and how the data analytics are performed, and potentially offering various data streams to the same group(s) of customers utilizing the same sensors to optimize quality and quantity of output.

The space industry is no stranger to partnering closely with suppliers and customers, including governments, to develop and deliver new technology and advance the art of the possible. By making the right investments, governments, investors, and users in edge computing can turn democratizing space from an expression into a reality.

This paper is the second of a two-part series analyzing the value of edge computing in space by the Boston Consulting Group. Read Part One: Size of the Prize: How Will Edge Computing in Space Drive Value Creation?

S. Sita Sonty leads Boston Consulting Group's Commercial Space team. John Wenstrup is a senior leader in BCG's Technology, Media & Telecommunications practice. Cameron Scott is Global Sector Lead for Defense and Security. And Dr. Hillary Child is a Project Leader from BCG's Chicago office.

Additional research by Avril Prakash, Sarvani Yellayi, Ansh Prasad, and John Kim


Tokenization of Investment Fund Units – Lexology

Tokenization of investment funds offers efficiency, cost reduction, compliance improvements, liquidity, transparency, and innovation, but faces legal, technological, and market challenges.

1. Introduction

When it comes to asset tokenization, it is essential to choose a blockchain that allows applications to be built directly on the blockchain. These applications are software programs known as smart contracts, which may serve almost any purpose. Given these two distinct but connected layers, it is important to distinguish between the protocol layer and the application layer.

The protocol layer refers to the blockchain as the underlying infrastructure. Article 973d of the Swiss Code of Obligations (CO) designates this layer (including further layers, cf. 2.2 below) as a securities ledger, which is appropriate as blockchain is one type of distributed ledger technology (DLT). Blockchain can be described as a decentralized and cryptographically secured database in a peer-to-peer network. A blockchain can only be extended chronologically, for which consensus among the nodes is required. Due to these features, a blockchain is considered immutable.

Furthermore, scalability solutions are now being built on top of most blockchains with the aim of making transactions faster, cheaper, and more efficient. Strictly speaking, these scalability solutions form a separate layer, as smart contracts (incl. DApps) can be built on top of or be connected to such solutions. However, for the sake of simplicity of this short overview, such scalability solutions will not be further explained herein.

The application layer refers to smart contracts. Depending on its purpose, a smart contract might mint tokens which could reflect the value of an asset, entitle the token holder to a membership right or use, or represent ownership of an item. Depending on the use case and, in particular, for the tokenization of financial instruments in Switzerland, an in-depth examination of the financial market regulations and art. 973d et seq. CO must be performed prior to proceeding with the tokenization of real assets. Failure to comply with financial market regulations can have severe consequences and may result in high fines and a forced liquidation of the company.

2. Tokenization in General

2.1. How Does Tokenization Work

In simplified terms, tokenization can be described as the process of digitizing an asset by creating a blockchain-based token. Combining this description with the understanding of the Federal Department of Finance (FDF), tokenization can be defined as the creation of a digital representation of a digital or non-digital asset that is electronically registered and therefore tradable on the blockchain. Consequently, this digital representation of a digital or non-digital asset constitutes a token.

With this in mind, it would be reasonable to conclude that a token can be defined as a representation or linkage of a digital or non-digital asset. However, in view of the BCP-Framework, which was introduced by MME in 2018 [1] and is shown below, this is only true for so-called Asset Tokens, Counterparty Tokens, or Ownership Tokens.

The categories of Asset Tokens, Counterparty Tokens, and Ownership Tokens truly represent a non-digital asset, e.g., a relative right or an absolute right to an asset. Accordingly, the represented right can only be transferred if the token is transferred, as the respective right and the token are inseparably interlinked, as further explained below.

However, in this regard it must be noted that the Swiss Financial Market Supervisory Authority (FINMA) does not subdivide Asset Tokens into further categories (cf. paragraph below).

2.2. Legal framework

In addition to the financial market regulations, which primarily aim to protect investors, in Switzerland the Collective Investment Schemes Act (CISA) is the authoritative law when launching a Swiss-based investment fund. It might therefore be surprising that the CISA is irrelevant for the tokenization process of investment fund units. Hence, neither the fund structure nor its method of distribution (e.g., listed funds) is relevant for the tokenization of the investment fund unit. However, make no mistake: the CISA applies to all Swiss-based investment funds regardless of whether their units are tokenized or not. For the legal qualification of a tokenized investment fund unit, however, other laws are relevant, which is why some compare tokenization to securitization.

Tokenization and securitization may be compared to the extent that all claims (relative rights) can be securitized and therefore also tokenized in the sense of a digital securitization. Nonetheless, securitization refers to the representation of a relative right either in negotiable securities (Wertpapiere), or in an entry in a centrally kept register and hence in uncertificated securities (einfache Wertrechte), or in intermediated securities (Bucheffekten). In the case of corporate membership rights, however, securitization is only possible where the law permits it, i.e., in the case of companies limited by shares (Aktiengesellschaft) and partnerships limited by shares (Kommanditaktiengesellschaft).

Conversely, tokenization goes further than securitization in the sense that not only securitizable rights/claims can be tokenized, but also non-securitizable rights, such as absolute rights or other membership rights, including investment fund units. Furthermore, tokenizing an asset aims to result in a ledger-based security (Registerwertrecht) and hence in a right that is electronically registered on a decentralized ledger, the blockchain.

From a private law perspective, tokenized shares or investment fund units generally qualify as ledger-based securities, provided the securities ledger (the blockchain including its respective layer [scalability solution and smart contract]) meets the requirements of art. 973d para. 2 CO. The category of ledger-based securities was specifically created in 2021 to recognize tokenized assets within Switzerlands legal framework.

However, contrary to the term ledger-based security, not every tokenized share or investment fund unit automatically qualifies as a security in the sense of financial market law. For a tokenized share or unit to constitute a security under securities law, the following must be met cumulatively:

(i) the tokenized right must be transferable only through the token (securitization);

(ii) the token must be publicly offered in the same structure and denomination (standardization); and

(iii) the tokens must be fungible among each other (fungibility).

If the aforementioned criteria are not met, the token will not qualify as a security but as a financial instrument, and the requirements of securities law do not apply. An issuer of such a financial instrument is, for example, not subject to prospectus obligations. The consequences, however, reach much further: the criminal provisions under financial market law on insider trading and market manipulation, for instance, only apply to securities.
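The cumulative three-part test above can be sketched as a simple check. This is an illustration of the logic only, not legal advice, and the function name is invented for this example:

```python
def classify_token(securitization: bool, standardization: bool, fungibility: bool) -> str:
    """Apply the cumulative test: a token qualifies as a security under
    securities law only if all three criteria are met; otherwise it is
    a (mere) financial instrument."""
    if securitization and standardization and fungibility:
        return "security"
    return "financial instrument"

print(classify_token(True, True, True))   # all three met
print(classify_token(True, True, False))  # fungibility missing
```

Because the test is cumulative, failing any single criterion (here, fungibility) is enough to drop the token out of the securities regime.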

3. Benefits and challenges of tokenization of Investment fund units

Tokenization of investment fund units can revolutionize the way investors access and trade assets. The benefits of tokenization are numerous, for investors, issuers and investment fund managers alike and include the following:

3.1. Benefits

3.1.1. Increased Efficiency

Blockchain technology removes the need for (financial) intermediaries by providing a decentralized and transparent ledger for transferring, verifying, and clearing transactions. This automatically results in more efficient transactions, as transactions are settled and cleared within seconds. Furthermore, blockchain technology allows a high level of automation through smart contracts, which may execute transactions automatically based on predefined conditions. The transfer of fund units can thereby be conducted instantly, with settlement and clearing occurring at the same time. In addition, customized features can be programmed directly into the units (e.g., whitelisting, freezing/unfreezing, destroying and recreating units, corporate actions, bans on specific countries, etc.).

3.1.2. Reduced Cost

As mentioned, intermediaries become obsolete in a blockchain ecosystem or, where they are still necessary, usually take on a new role (e.g., banks). Consequently, tokenization will result in cost savings for the issuance of financial instruments as well as for other processes, including corporate actions, reconciliation, and trading on the secondary market. Likewise, the possibility for automation as well as transparency in record keeping may significantly reduce costs for issuers, investors, and investment fund managers alike. Automation further significantly reduces the risk of errors. Accordingly, the management of complex compliance requirements becomes significantly cheaper, especially because specific rules can be programmed directly into each token.

3.1.3. Improved Compliance

Tokenized investment units may significantly improve compliance or at least facilitate the compliance management of an investment fund provided the necessary infrastructure exists. For example, both the financial markets regulations as well as the CISA provide for a mandatory segmentation of investor categories some investment fund units may therefore only be offered to professional, qualified, or institutional investors but not to retail investors. By tokenizing an investment fund unit, it is possible to code such compliance rules into the token or into the smart contract (depending on the chosen blockchain protocol), i.e., by labeling the token as a unit meant only for professional investors. As a result, and in conjunction with a whitelist or blacklist (e.g. segmentation of investors into the above categories), such a token may only be traded by an investor who qualifies as a professional investor.
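The whitelist gating described above could be sketched as follows. This is a hypothetical, simplified model in Python, not a real smart-contract API; the class, method names, and investor categories are invented for illustration:

```python
class FundUnitToken:
    """Toy model of a tokenized fund unit whose transfer logic enforces
    an investor-category whitelist (hypothetical, simplified)."""

    def __init__(self):
        self.whitelist = {}  # address -> investor category
        self.balances = {}   # address -> units held

    def register(self, address, category):
        self.whitelist[address] = category

    def mint(self, address, amount):
        self.balances[address] = self.balances.get(address, 0) + amount

    def transfer(self, sender, recipient, amount):
        # Compliance rule coded into the token: this unit class may only
        # be held by investors whitelisted as "professional".
        if self.whitelist.get(recipient) != "professional":
            raise PermissionError("recipient is not a whitelisted professional investor")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

token = FundUnitToken()
token.register("alice", "professional")
token.register("carol", "professional")
token.register("bob", "retail")
token.mint("alice", 100)

token.transfer("alice", "carol", 10)    # allowed: carol is professional
try:
    token.transfer("alice", "bob", 10)  # blocked: bob is retail
except PermissionError as exc:
    print("blocked:", exc)
```

On a real blockchain the same check would live in the token's smart-contract transfer function, so the restriction travels with the token rather than relying on off-chain enforcement.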

Similarly, other and individual compliance rules can be included in tokenized investment fund units, such as trading halts or restrictions on sanctioned individuals or countries. As blockchain and tokenization continue to gain traction, the adoption of this technology for compliance purposes is expected to offer transformative benefits across various industries and use cases.

3.1.4. Increased Transparency

Blockchain technology provides a distributed, immutable, and transparent ledger for recording transactions and can thereby serve as a single source of truth for all parties involved, improving transparency and reducing disputes around record keeping. As a result, the utilization of blockchain technology is expected to enhance the efficiency and reliability of the trading, settlement, and clearing of transactions. Blockchain technology further enables tracking and traceability of tokenized assets throughout their lifecycle, as each token representing an asset can be uniquely identified and recorded on the blockchain. This allows for transparent tracking of ownership, transfers, and other relevant information and provides a clear and auditable record of an asset's history, helping to prevent fraud, forgery, and other illicit activities.

Furthermore, tokenization of assets paves the way for asset management 2.0, as smart contracts can be programmed to invest according to a pre-programmed risk appetite and portfolio diversity, without the need for human interaction.

3.1.5. Improved Liquidity

Tokenization can improve liquidity of investment fund units (and all other financial instruments) in two respects:

(i) Firstly, most investment funds are not listed and therefore tend to be illiquid. By issuing tokenized investment fund units, these units become, in theory, immediately tradable on the blockchain, turning an illiquid product into a liquid one, since at least the possibility of a facilitated exchange of such investment fund units exists. In practice, however, and in particular to comply with the relevant laws, a dedicated trading venue is required.

(ii) Secondly, and this applies mostly to private equity fund units and instruments, participating in a private fund or venture traditionally requires investing a considerable amount. Such large tickets can be daunting and require a substantial commitment, especially considering that the investment amount is usually subject to a lock-up period. Tokenization provides a remedy: a large ticket can be tokenized and divided into several smaller tickets. The potential use cases are virtually unlimited and depend mostly on the fund structure (contractual vs. corporate, open vs. closed) as well as on the respective governing documents, such as a (limited) partnership agreement or the fund agreement. These documents ultimately also determine who tokenizes the investment fund units or who divides a large ticket into smaller ones. For example, the fund management may issue only tokenized fund units, or a large investor may tokenize larger tickets and divide them into smaller ones.
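The arithmetic of splitting a large ticket into smaller tokenized tickets is straightforward and can be sketched directly. The function name and figures below are made up for illustration.

```python
# Illustrative sketch: dividing one large private-equity ticket into
# equal tokenized tickets. Figures are hypothetical.

def fractionalize(ticket_size, token_denomination):
    """Split a large ticket into equally sized tokenized tickets."""
    if ticket_size % token_denomination != 0:
        raise ValueError("ticket must divide evenly into tokens")
    return ticket_size // token_denomination
```

For example, a 1,000,000 ticket tokenized at 1,000 per token yields 1,000 smaller tickets, each of which can be held or traded independently, subject to the fund's governing documents.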

Furthermore, tokenization also promotes financial inclusion: many people in developing countries do not have a bank account, let alone a trading account, yet almost everyone has a smartphone, provided there is a stable and affordable internet connection. By tokenizing financial instruments, such instruments can be made accessible in developing countries as well, since all that is needed to trade tokenized assets is a registered wallet on a smartphone, provided the applicable legal framework permits it. Greater accessibility, in turn, leads to increased liquidity.

3.1.6. Facilitated Innovation

Tokenization has the potential to revolutionize the investment landscape by allowing for the creation of novel and innovative investment products, including fractionalized real estate, liquid revenue share agreements, dynamic ETFs, and other previously unmanageable offerings. This can expand investment opportunities for investors and generate new revenue streams for issuers, ushering in a new era of investment possibilities.

3.2. Challenges

While tokenization offers numerous benefits, there are also challenges that need to be addressed. Challenges may include the following:

3.2.1. Legal and Regulatory Challenges

Tokenization presents unique and novel legal and regulatory complexities and requires, among other things, compliance with securities laws and regulations. The legal and regulatory frameworks surrounding tokenization in Switzerland are, in an international context, highly advanced and exemplary, but they are still evolving, especially when it comes to their interpretation by the regulators. Consequently, issuers must take measures to ensure compliance with all applicable laws and regulations.

3.2.2. Technological Challenges

Tokenization introduces novel technological hurdles, such as the imperative for robust cybersecurity protocols and the risk of technological disruptions that could impact asset trading and settlement/clearing processes. In addition, the increased use of blockchain technology may cause tokenization to face scalability issues, as networks may be limited in terms of transaction processing speed and capacity. Other issues may arise from a lack of interoperability or from vulnerabilities due to coding errors or flaws in the underlying blockchain technology itself. Therefore, in-depth knowledge and understanding of the underlying technology, its strategy, its scalability solutions, and other projects running on the blockchain are indispensable. Each project may or may not have a certain influence on the respective blockchain ecosystem as a whole.

3.2.3. Market Adoption Challenges

The concept of tokenization is still relatively new, and it may take some time for it to be widely adopted. Issuers may need to educate investors on the advantages of tokenization and strive to establish trust in the technology. Furthermore, trading on secondary markets is still a challenge today, as such exchanges hardly exist. Nevertheless, this challenge exists equally in the traditional financial system, as liquidity is only assured where a market maker exists.

3.3. Balancing Benefits and Challenges

Tokenization presents a plethora of advantages for both issuers and investors, such as enhanced efficiency, cost reduction, improved compliance, heightened liquidity, increased transparency, and facilitated innovation. Nevertheless, there are also obstacles that must be tackled, including legal and regulatory issues, technological complexities, and market adoption challenges. As the legal and regulatory frameworks for tokenization evolve, issuers and investors can anticipate growing opportunities for investment and expansion in this promising emerging field.

4. Conclusion

When tokenizing real assets, the jurisdiction must always be taken into account, as various jurisdictions have different approaches and regulatory regimes: some are more favorable towards tokenization than others. It is advisable to contact a law firm with a corresponding track record before launching a tokenization project in order to avoid unwanted proceedings or penalties from regulatory authorities.

For more information on tokenization of private equity funds in Switzerland by example of the Limited Qualified Investor Fund, click here.

Go here to see the original:

Tokenization of Investment Fund Units - Lexology


The Future Of DeFi: Exploring BNB, Cardano, and Caged Beasts | Bitcoinist.com – Bitcoinist

As the world of decentralized finance (DeFi) continues to evolve, the spotlight falls on three prominent players: BNB, Cardano, and Caged Beasts. Join us on a journey to explore the future of DeFi and uncover the answer to the burning question: Which of these platforms will shape the financial landscape of tomorrow?

BNB, also known as Binance Coin, has emerged as a prominent cryptocurrency within the digital asset realm. Introduced in 2017 by the renowned exchange Binance, BNB has quickly solidified its position as a significant player in the market.

Primarily functioning as a utility token, BNB serves various purposes within the Binance ecosystem. It grants users discounted trading fees, unlocks access to exclusive features, and facilitates seamless transactions on the Binance platform. Beyond Binance, BNB has expanded its utility across decentralized applications (dApps) and decentralized finance (DeFi) platforms.

The future of DeFi holds great potential for BNB. Its vital role within the Binance ecosystem positions it as a fundamental currency for trading and accessing diverse services, establishing itself as an integral component of the DeFi infrastructure.

Additionally, BNB has embraced blockchain interoperability through its integration with the Binance Smart Chain (BSC). This interoperability enables developers to build decentralized applications and deploy smart contracts, providing an alternative to Ethereum with the added benefits of reduced transaction costs and quicker confirmations.

As the DeFi landscape continues to evolve, BNB's multi-faceted functionality, ecosystem integration, and commitment to innovation position it as a key player that can shape the future of decentralized finance.

Cardano (ADA) is a blockchain platform with the potential to revolutionize decentralized finance (DeFi). Developed by a team of academics and engineers, Cardano stands out for its scientific approach and robust infrastructure. It prioritizes peer-reviewed protocols, scalability, and sustainability.

Cardano's layered architecture and unique separation of settlement and computation layers enhance scalability, enabling faster and more cost-effective transactions. Its native programming language, Plutus, facilitates secure and complex smart contract development, reducing vulnerabilities.

Interoperability is a key strength of Cardano, allowing seamless integration with other blockchains. This fosters collaboration and opens up possibilities for diverse DeFi applications.

Cardano's commitment to inclusivity and sustainability sets it apart. It prioritizes ethical and eco-friendly solutions while providing equal access to financial services.

With its scientific rigor, scalability, smart contract capabilities, interoperability, and focus on sustainability, Cardano is poised to shape the future of DeFi.

Caged Beasts, a new and intriguing meme coin, has the potential to shape the future of decentralized finance (DeFi) through its unique concept and immersive experience. By introducing caged beasts as representations of each BEASTS token, Caged Beasts aims to cultivate an army of powerful creatures that can disrupt the financial landscape.

The project goes beyond being just a cryptocurrency, offering a captivating narrative set in an animal testing lab. With each stage of the presale, these caged animals undergo a transformation fueled by mutagens, cybernetics, and weaponry. The goal is to unleash them into the crypto world and challenge the dominance of traditional financial systems.

What sets Caged Beasts apart is its strong emphasis on community engagement. By locking 75% of the funds until the release date, the project ensures controlled distribution and fosters trust and transparency. Additionally, allocating 25% of funds to the marketing wallet demonstrates a commitment to raising brand awareness and attracting new participants.

As a brand-new meme coin, Caged Beasts presents an exciting investment opportunity, especially during its early growth phase. Joining the presale enables investors to secure their position early and potentially benefit from the future developments and impact of Caged Beasts in the world of DeFi.

For More About Caged Beasts:

Website: https://cagedbeasts.com
Twitter: https://twitter.com/CAGED_BEASTS
Telegram: https://t.me/CAGEDBEASTS

Disclaimer: This is a paid release. The statements, views and opinions expressed in this column are solely those of the content provider and do not necessarily represent those of Bitcoinist. Bitcoinist does not guarantee the accuracy or timeliness of information available in such content. Do your research and invest at your own risk.

See the rest here:

The Future Of DeFi: Exploring BNB, Cardano, and Caged Beasts | Bitcoinist.com - Bitcoinist


Bitcoin NFT Blockchain Holds Second Space Defeating Solana – The Coin Republic

The current sales volume of Bitcoin is around $173 million, while that of Ethereum is $391 million. Solana secures the third position with a $53 million sales volume. Recently, Bitcoin Frogs were popular and hit record-high transaction volumes. The Bitcoin blockchain has been further popularized by Bitcoin Frogs, a meme-themed NFT collection.

A blockchain is best known for storing the data of transactions. They can not be altered because there is no way to change the blocks. Bitcoin blockchain has exploded into various cryptocurrencies, NFTs, DeFi, and smart contracts.

Bitcoin's blockchain is decentralized, so there is no central authority, and transactions are immutable and viewable by all. Bitcoin NFTs have been around for some time, but they are now growing at a faster rate.

Gone are the times when we hear that popular NFTs are linked with Ethereum or Solana. Now, the most popular NFTs are linked with Bitcoin blockchains. NFTs on Bitcoin are unique and secured. The first NFT released on Bitcoin is Rare Pepes.

NFTs can be created on Bitcoin with the Bitcoin Ordinals and other base layers. These layers can have smart contracts and can settle the transactions on the base layer. Bitcoin NFTs are inexpensive, sustainable, and scalable due to layers. They make minting and exchanging affordable and secure.

The famous BRC-20 tokens like ORDI, MEME, and PEPE have no utility attached. They were issued just for experimental purposes. Twitter user DOMO created ORDI to showcase the BRC-20 functionality that is created by ordinals.

The market has now shifted toward media-based ordinals; hence a shift from BRC-20 tokens to Bitcoin NFTs can be seen. In the past week, Bitcoin-based NFTs were in the top 3, following the Bitcoin Frogs hype.

The top Bitcoin NFTs are Space Pepes, Bitcoin Frogs, and $ORDI BRC-20 NFTs. Space Pepes have sales of approximately $7M, Bitcoin Frogs of approximately $5M, and $ORDI BRC-20 NFTs of approximately $2M. BITAmigos is the top NFT in the last 24 hours.

In the first half of May, BRC-20 meme coins reached the $1 billion mark. The current market capitalization of BRC-20 tokens is $447 million.


Comparatively, the Bitcoin ecosystem appears to lag far behind Ethereum. Bitcoin supports decentralized autonomous organizations.

It has the advantage of using integration with the $27.5 billion-strong decentralized finance ecosystem. BAYC and Azuki are much more advanced. They have metaverse projects which provide exclusive add-on to NFT holders.

CryptoPunks enjoy popularity due to their rarity. It remains to be seen how popular Bitcoin NFTs will stay; they still need to build a strong community. The market for Bitcoin NFTs has been going up since January, and Galaxy estimates it could be worth $4.5 billion by 2025.

Nancy J. Allen is a crypto enthusiast and believes that cryptocurrencies inspire people to be their own banks and step aside from traditional monetary exchange systems. She is also intrigued by blockchain technology and its functioning.

See the rest here:

Bitcoin NFT Blockchain Holds Second Space Defeating Solana - The Coin Republic


How Banks Plan to Use AI to Boost Web3 Adoption – BeInCrypto

Investment bank Goldman Sachs and Microsoft want to boost Web3 uptake on the Canton blockchain through artificial intelligence (AI).

The duo joins traditional finance (TradFi) giants Deloitte, S&P Global, Moody's, BNP Paribas, and Cboe Global Markets in building infrastructure during the crypto bear market.

The recently released Canton Network links the trading platforms of Goldman and Deutsche Börse, whose notional volumes exceed the trading activity of many crypto assets.

The network is built on Microsoft's Azure cloud. The consortium hopes to attract developers with the new digital asset smart contract language.

Microsoft said last month it wants to increase Web3 users with artificial intelligence on Canton. The firm said yesterday that it added Bing to OpenAI's ChatGPT Plus premium service.

AI can analyze app usage patterns to help Web3 firms elevate user experiences. Firstly, it can assess a product's weak points and help users easily pick up from where they left off.

In addition, the technology can also streamline complex tasks like decentralized governance and token management. AI can also improve network management through automated data collection, decision-making, monitoring for malicious activity, and streamlining transaction processing.

Google, also a notable cloud and AI player, became a Solana validator last year. After that, it joined forces with the Tezos Foundation in February for similar reasons.

Google's deal with Polygon last month provides tooling and infrastructure empowering zero-knowledge projects. The Silicon Valley giant recently opened the preview of its PaLM 2 library to enable coders to add AI to their applications.

TradFi firms envision real-world asset tokenization as their next goal, as banks can benefit from faster asset transfers on blockchains. According to Cathy Clay of Cboe, Canton can help create new market infrastructure and drive efficiency in the trading of products across the globe.

Early efforts have tokenized valuable assets like real estate, vehicles, or fiat for fast transfer across blockchains. BlackRock CEO Larry Fink told shareholders the firm would tokenize stocks and bonds this year.

Previously, JPMorgan Chase exchanged tokenized Japanese yen and U.S. dollars using a permissioned Aave pool whose access was governed by credentials in smart contracts.

For BeInCrypto's latest Bitcoin (BTC) analysis, click here.

In adherence to the Trust Project guidelines, BeInCrypto is committed to unbiased, transparent reporting. This news article aims to provide accurate, timely information. However, readers are advised to verify facts independently and consult with a professional before making any decisions based on this content.

Read more from the original source:

How Banks Plan to Use AI to Boost Web3 Adoption - BeInCrypto


Vitalik Buterin calls DFINITY Ethereum’s sister network EDCON … – Cryptopolitan

During the EDCON 2023 conference, Vitalik Buterin called the Internet Computer network Ethereum's sister network, refuting competition claims. In the world of blockchain technology and crypto, Ethereum has emerged as a prominent player, revolutionizing the way we think about decentralized applications and smart contracts.

However, another platform has recently gained significant attention in the crypto community, positioning itself as Ethereums sister network: the DFINITY project. DFINITY is a groundbreaking blockchain platform that aims to provide a highly scalable and efficient decentralized computing network. Created by a team of visionary developers, DFINITY offers a unique approach to blockchain technology.

Ethereum and DFINITY both strive to create a decentralized future, empowering individuals and businesses with the ability to interact directly, securely, and without intermediaries. While Ethereum has established itself as a leader in the field, DFINITY aims to complement its capabilities by addressing some of the scalability challenges that Ethereum currently faces.

When it comes to Layer 1s that allow smart contracts, Ethereum is without a doubt the most popular. With a market valuation of approximately 216 billion dollars and 70 billion dollars locked in various DeFi protocols, Ethereum's success has surely opened the door to many technical advances that other cryptocurrencies employ today.

DFINITY is the public blockchain-based cloud computing network intended to seed a decentralized internet known as cloud 3.0. It is also completely compatible with the Ethereum Virtual Machine (EVM) and significantly reduces the cost of IT systems and business applications.

Notably, DFINITY's Blockchain Nervous System can mechanically adjust economic conditions as well as specific network settings to suit capacity needs. In brief, the network can evolve and address issues in real time, making it far more adaptable than present systems.

Ethereum is unquestionably the dominant decentralized public blockchain platform for running smart contracts. The Ethereum smart contract and EVM components added credibility to the technology. In addition, it possesses all the fundamental advantages that conventional systems lack. Here are the top 5 differences between Ethereum and DFINITY.

1. On-chain governance VS off-chain governance The Internet Computer's BNS is a built-in on-chain governance structure, whereas Ethereum's discussions and decisions must be conducted off-chain.

2. Proof of Stake (PoS) VS Proof of Work (PoW) PoW requires block creators to solve cryptographic puzzles to earn the right to create a new block. PoS is a system in which the right to produce the next block depends on the fraction of stake you own or deposit into the system.

In comparison to PoS, the PoW system requires more costly computations, whereas under PoS individuals produce evidence of their stake, which is the approach DFINITY uses. Ethereum originally used PoW but has since transitioned to PoS.
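The "costly computations" of PoW can be shown with a toy example: miners search for a nonce whose hash meets a difficulty condition. This is a deliberately simplified Python sketch; real PoW schemes such as Bitcoin's use double SHA-256 against a numeric difficulty target rather than a zero-prefix check.

```python
import hashlib

# Toy proof-of-work sketch: find a nonce such that the SHA-256 hash of the
# block data plus the nonce starts with `difficulty` zero hex digits.
# Raising the difficulty makes the search exponentially more expensive,
# which is the cost PoW imposes on block creators.

def mine(block_data, difficulty=3):
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1
```

Anyone can verify the result with a single hash, while finding it required many; PoS replaces this search with a proof of deposited stake.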

3. Security over liveness VS liveness over security DFINITY typically relies on so-called threshold groups, in which a random selection of 400 IDs produces blocks and generates unique threshold signatures. When the threshold is not met, the entire system halts. Ethereum takes a different approach: its proof-of-work architecture prioritizes liveness over security, whereas the Internet Computer favors security over liveness.

4. Fixed-size deposits VS variably sized deposits DFINITY uses fixed-size deposits, where the quantity of stake you must deposit is set by the system and more than one ID must be created. Ethereum allows variably sized deposits, where you can design the system with diverse effects.

As a result, in the never-ending war between The Internet Computer and Ethereum, both win according to different standards. It is also important to note that both are roughly equivalent in terms of their distinct individual characteristics.

5. Actor model VS serialized contract execution DFINITY generally follows the actor model, which allows both parallel contract execution and asynchronous message passing. On Ethereum, by contrast, everything happens one after the other, and a lot of data must be held in memory, which puts pressure on a machine's memory.

Original post:

Vitalik Buterin calls DFINITY Ethereum's sister network EDCON ... - Cryptopolitan


How Will Dogetti Fare When Bitcoin and Ethereum’s Price and … – Analytics Insight

In the fast-paced world of cryptocurrencies, Bitcoin (BTC) and Ethereum (ETH) have come out on top, captivating investors and enthusiasts alike with their groundbreaking technologies and forward-thinking attitudes. It was even announced, recently, that Bitcoin's price is now braced for 3 billion users, with the CEO of Strike issuing a serious Coinbase and Ethereum warning.

This article aims to dissect the similarities and differences between these industry giants, exploring their impact on the market and shedding light on what this means for up-and-coming projects newly entering the market, such as Dogetti (DETI).

Bitcoin, the first decentralized digital currency, burst onto the global scene in 2009. It exists as a reaction to the 2008 Financial Crisis: a reaction against a traditional financial market that had been shown to fail its users. By creating a network where users work together, Bitcoin is able to remain stable, avoiding financial fates like the one it was created from.

Bitcoin operates on a peer-to-peer network, allowing users to conduct secure transactions without the need for intermediaries. With a finite supply of 21 million coins, Bitcoin's scarcity has been a driving force behind its value, allowing the coin to dominate the market.

While Bitcoin blazed the trail, Ethereum introduced a revolutionary concept to the crypto landscape: smart contracts. Launched in 2015, Ethereum expanded the possibilities of blockchain technology by enabling developers to create decentralized applications (dApps) and execute programmable contracts. Ethereum's native cryptocurrency, Ether, fuels the network and is highly sought after for its utility within the Ethereum ecosystem.

Since its inception, the answer to the question "Is Ethereum a good coin to buy?" has almost always been "yes" from the wider crypto community, with the network continuing to innovate to this day.

Bitcoin and Ethereum have exerted a significant influence on the broader cryptocurrency market. As the leading cryptocurrencies by market capitalization, their price movements often influence the industry as a whole. Both coins have experienced substantial price volatility, attracting investors looking to capitalize on market fluctuations.

Bitcoin, with its widespread recognition and established infrastructure, has become a digital asset that institutional investors and hedge funds are increasingly considering as a hedge against inflation. Its finite supply and growing acceptance in mainstream financial institutions have contributed to its status as "digital gold."

On the other hand, Ethereum's impact stretches beyond being a mere cryptocurrency. Its underlying blockchain platform has become a foundation for countless innovative projects and decentralized finance (DeFi) applications. Ethereum's smart contract functionality has unlocked new possibilities for fundraising through Initial Coin Offerings (ICOs) and tokenization of assets, revolutionizing the way businesses operate and raising the bar for technological advancements.

Amid the rise of Bitcoin and Ethereum, Dogetti, a soon-to-launch meme token, must navigate the evolving market dynamics.

Bitcoin's prominence as a store of value and hedge against traditional financial uncertainties presents Dogetti with a unique opportunity: Bitcoin's motivations are based on an event from over a decade ago, with seriousness in its operation. Dogetti, as a mob-themed dog token, has space to play and inject fun into crypto trading, giving it a specific appeal.

This is also the case with Ethereum. Ethereum has been designed with practicality and technological advancement at its forefront, making it somewhat challenging to get involved with, especially for crypto newcomers. Dogetti breaks down these walls with an easily accessible project, giving it opportunities to build a wide and dedicated community.

In the ever-changing landscape of cryptocurrencies, Bitcoin and Ethereum stand out as influential players, each with its own unique strengths and impact on the market. While Bitcoin serves as a store of value and medium of exchange, Ethereum has revolutionized the concept of smart contracts and decentralized applications. The impact of these cryptocurrencies transcends their individual networks, influencing the entire crypto market.

For Dogetti, understanding the implications of Bitcoin and Ethereum is crucial. By capitalizing on the popularity and utility of these coins, Dogetti can cement its position in the crypto world and provide its users with innovative solutions.

Dogetti is on pace to launch in just a few weeks' time, offering its users a compelling opportunity to make the most of their investment. By using code LAUNCHDETI, users can expect a 400% token bonus at launch, making now the perfect time to get involved.

Presale: https://dogetti.io/how-to-buy

Website: https://dogetti.io/

Telegram: https://t.me/Dogetti

Twitter: https://twitter.com/_Dogetti_

The rest is here:

How Will Dogetti Fare When Bitcoin and Ethereum's Price and ... - Analytics Insight


Ethereum (ETH) Staking Gains Momentum: $2,000 Price Target? – BeInCrypto

Ethereum (ETH) jumped 4% this week to clear the $1,850 resistance. With investors accelerating DeFi staking, ETH looks set to make more gains in the coming days. Can the bulls gain enough momentum to validate the bullish $2,000 ETH price prediction?

This week, the cryptocurrency market experienced a surge as Ethereum (ETH) and other Layer-1 coins made sizeable gains. On-chain data shows that the rise in staking activities among ETH holders is a critical factor behind the current rally.

Here's why Ethereum investors are holding out for more gains in the coming days.

This week, the percentage of ETH circulating supply staked across the ETH 2.0 mainnet and DeFi smart contracts has risen to a new all-time high.

The chart below shows that after the recent blip on May 17, Ethereum investors have staked an additional 430,500 ETH as of May 22.

The Supply in Smart Contracts metric tracks the percentage of a cryptocurrency's circulating supply that investors have locked up in various staking protocols. When it starts to increase, it causes a temporary shortage in market supply.

If Ethereum investors continue to stake at this rate, the recent ETH price surge could evolve into a prolonged bull rally.

Furthermore, the decline in ETH's Network Value to Transaction Volume (NVT) ratio reveals that it is currently undervalued. The chart below shows how the Ethereum NVT ratio dropped 49% from 92.92 to 46.64 between May 20 and May 22.

Typically, strategic investors use the NVT ratio to assess the relationship between a cryptocurrency's market capitalization and the underlying transactional activity.

When the NVT ratio drops considerably, as observed above, it indicates that the asset is still undervalued and could be due for more price pumps.
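The NVT arithmetic in the passage above is easy to check directly: the ratio divides network value (market capitalization) by transaction volume, and the cited ~49% decline follows from the two readings. A minimal Python sketch:

```python
# NVT ratio = market capitalization / on-chain transaction volume.
# The two readings cited in the article (92.92 on May 20, 46.64 on May 22)
# imply roughly a 49% drop, matching the text.

def nvt_ratio(market_cap, tx_volume):
    return market_cap / tx_volume

def pct_change(old, new):
    """Percentage change from old to new (negative for a decline)."""
    return (new - old) / old * 100

drop = pct_change(92.92, 46.64)  # approximately -49.8
```

A falling NVT means transaction volume is growing faster than market value, which is why analysts read it as a sign the asset may be undervalued.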

In summary, the low NVT ratio could spur other investors to mirror the trades of the bullish whales. If that happens, the heightened demand could validate bullish ETH price predictions.

IntoTheBlock's In/Out of the Money Price Distribution data signals that ETH could soon reclaim the $2,000 milestone.

However, Ethereum could have difficulty breaking above the $1,925 resistance level. At that zone, 1.41 million investors holding 1.31 million ETH could sell when they break even around $1,925 and inadvertently trigger a pullback.

Nevertheless, as predicted, those holders could turn bullish if the bullish momentum strengthens. If that happens, ETH can break out and rally toward $2,100.

Still, the bullish Ethereum price prediction could be invalidated if ETH price drops below $1,800 again.

However, the 3.23 million investors that bought 4.86 million ETH at an average of $1,800 can offer some support.

If that support level cannot hold, ETH may drop to $1,750.

In line with the Trust Project guidelines, this price analysis article is for informational purposes only and should not be considered financial or investment advice. BeInCrypto is committed to accurate, unbiased reporting, but market conditions are subject to change without notice. Always conduct your own research and consult with a professional before making any financial decisions.

Read the original here:

Ethereum (ETH) Staking Gains Momentum: $2,000 Price Target? - BeInCrypto


Legacy Overhaul & Data Mining at Top of Mind for Health Care … – GovCon Wire


Due to the demands of the COVID-19 pandemic, the healthcare space has been pummeled over the last few years to answer the manifold and ever-growing needs of citizens. Many of these needs have solutions that can only be resolved through agencies' information technology systems.

"For us, during COVID, it was really about, how quickly can we leverage our growing capabilities in cloud … We stood up data analytics and visualization platforms. In partnership with Microsoft, we rapidly stood up a COVID bot on the website that enabled citizens to determine what they needed to do with regard to COVID symptoms. And we rapidly put in place border health monitoring capabilities, again, all leveraging cloud," said Centers for Disease Control Deputy Chief Information Officer Jason Bonander. The executive shared his thoughts during a recent event hosted by GovCon Wire.

Data administration and the replacement of legacy systems were the primary themes that arose in this discussion between private and public sector healthcare technology professionals. GCW's 2nd Annual Healthcare IT Digital Transformation Forum featured Leidos Health Group Vice President Bobby Saxon as moderator for the "What's Next for Modernization" panel. Leidos was also a Platinum sponsor of the May 17 event alongside Core4ce and Silver sponsor Unanet.

If you missed the event, you can watch the full slate of discussion here. You can also browse and register for upcoming GCW events here and those from sister service Potomac Officers Club here.

Bonander went on to say that cloud is less a renegade modernization practice at this point than it is core infrastructure, a given. Still, he said it offers evergreen possibilities that the CDC is still mining in post-pandemic efforts, where the agency has been prompted into realizing it must "fundamentally modernize and change how [they] develop applications, how and where [they] host those applications." He said they're operating at a quick pace in remaking and reimagining their offerings.

Fellow panelist Dr. Susan Monarez, deputy director of the Advanced Research Projects Agency for Health, reported that her organization has the distinct benefit of being brand new, so it doesn't have any legacy systems to overturn. With regard to making progress in the government health technology space, Monarez said that the goal is to avoid getting into a situation where something that could be realized via an innovative tool or strategy in a matter of three to five years ends up taking 10 to 20 years.

To do so, Monarez says ARPA-H is attempting to break new ground in its field.

"How do we take core concepts for the health ecosystem we talk about, from the molecular to the societal, and think about the problems in ways that just are fundamentally different than the way that folks have been thinking about the problems? How could we actually start to address those problems in a way that hasn't been done before?" Monarez explained of the research and development hub's mindset.

When Saxon queried the panelists about how their organizations are aiding decision support, Bonander said its about turning the massive amounts of data the CDC receives and processes into actionable insights in a way that benefits state and local health partners as well as the average citizen.

"Data is the lifeblood of public health," Bonander stated, while Monarez said she loses a lot of sleep from excitement in thinking about the possibilities of creating a full-scale way to make health sector data actionable.

Core4ce CEO Jack Wilmer noted the utility of artificial intelligence technologies in sorting through large amounts of data to boost decision-making, but also said that it's important to implement explainable AI whose choices and determinations are legible to the citizen user.

In looking to the future, Wilmer, who previously worked for the Department of Defense, believes that, despite critiques that say the opposite, the government can actually be incredibly forward-leaning in its modernization and innovation practices. Specifically, he referenced biosurveillance initiatives where existing data is being exploited in productive ways. Such practices might, Wilmer said, even help in the mission to predict or identify the next pandemic before it's too late.

See original here:

Legacy Overhaul & Data Mining at Top of Mind for Health Care ... - GovCon Wire
