
Domain-Driven Cloud: Aligning your Cloud Architecture to your … – InfoQ.com

Key Takeaways

Domain-Driven Cloud (DDC) is an approach for creating your organization's cloud architecture based on your business model. DDC uses the bounded contexts of your business model as inputs and outputs a flexible cloud architecture that supports all of the workloads in your organization and evolves as your business changes. DDC promotes team autonomy by giving teams the ability to innovate within guardrails. Operationally, DDC simplifies security, governance, integration, and cost management in a way that promotes transparency for IT and business stakeholders alike.

Based on Domain-Driven Design (DDD) and the architecture principle of high cohesion and low coupling, this article introduces DDC including the technical and human benefits of aligning your cloud architecture to the bounded contexts in your business model. You will learn how DDC can be implemented in cloud platforms including Amazon Web Services (AWS) and Microsoft Azure while aligning with their well-architected frameworks. Using illustrative examples from one of our real customers, you will learn the 5 steps to implementing DDC in your organization.

DDC extends the principles of DDD beyond traditional software systems to create a unifying architecture spanning business domains, software systems and cloud infrastructure.

Our customers perpetually strive to align "people, process and technology" together so they can work in harmony to deliver business outcomes. However, in practice, this often falls down as the Business (Biz), IT Development (Dev) and IT Operations (Ops) all go to their separate corners to design solutions for complex problems that actually span all three.

What emerge are business process redesigns, enterprise architectures, and cloud platform architectures, all designed and implemented by different groups using different approaches and localized languages.

What's missing is a unified architecture approach using a shared language that integrates BizDevOps. This is where DDC steps in, with a specific focus on aligning the cloud architecture, and the software systems that run on it, to the bounded contexts of your business model, identified using DDD. Figure 1 illustrates how DDC extends the principles of DDD to include cloud infrastructure architecture and in doing so creates a unified architecture that aligns BizDevOps.


In DDC, the most important cloud services are AWS Organizational Units (OUs) that contain Accounts and Azure Management Groups (MGs) that contain Subscriptions. Because 100% of the cloud resources you secure, use and pay for are connected to Accounts and Subscriptions, these are the natural cost and security containers. By enabling management and security at the higher OU/MG level and anchoring these on the bounded contexts of your business model, you can now create a unifying architecture spanning Biz, Dev and Ops. You can do this while giving your teams flexibility in how they use Accounts and Subscriptions to meet specific requirements.

The benefits of aligning your cloud architecture to your organization's business model include team autonomy within guardrails, simplified security, governance, integration, and cost management, and transparency for IT and business stakeholders alike.

DDC may not be the best approach in all situations. Alternatives such as organizing your cloud architecture by tenant/customer (SaaS) or legal entity are viable options, too.

Unfortunately, we often see customers default to organizing their cloud architecture by their current org structure, following Conway's Law from the 1960s. We think this is a mistake and that DDC is a better alternative for one simple reason: your business model is more stable than your org structure.

One of the core tenets of good architecture is that more stable components should not depend on less stable components (the Stable Dependencies Principle). Organizations, especially large ones, like to reorganize often, making their org structure less stable than their business model. Basing your cloud architecture on your org structure means that every time you reorganize, your cloud architecture is directly impacted, which may in turn impact all the workloads running in your cloud environment. Why do this? Basing your cloud architecture on your organization's business model enables it to evolve naturally as your business strategy evolves, as seen in Figure 2.


We recognize that, as Ruth Malan states, "If the architecture of the system and the architecture of the organization are at odds, the architecture of the organization wins". We also acknowledge there is work to do with how OUs/MGs and all the workloads within them best align to team boundaries and responsibilities. We think ideas like Team Topologies may help here.

We are seeing today's organizations move away from siloed departmental projects within formal communications structures to cross-functional teams creating products and services that span organizational boundaries. These modern solutions run in the cloud, so we feel the time is right for evolving your enterprise architecture in a way that unifies Biz, Dev, and Ops using a shared language and architecture approach.

Both AWS's Well-Architected Framework and Azure's Well-Architected Framework provide a curated set of design principles and best practices for designing and operating systems in your cloud environments. DDC fully embraces these frameworks, and at SingleStone we use them with our customers. While these frameworks provide specific recommendations and benefits for organizing your workloads into multiple Accounts or Subscriptions, managed with OUs and MGs, they leave it to you to figure out the best taxonomy for your organization.

DDC is opinionated on basing your cloud architecture on your bounded contexts, while being 100% compatible with models like AWS's Separated AEO/IEO and design principles like "Perform operations as code" and "Automatically recover from failure". You can adopt DDC and apply these best practices, too. Tools such as AWS Landing Zone and Azure Landing Zones can accelerate the setup of your cloud architecture while also being domain-driven.

Do you think a unified architecture using a shared language across BizDevOps might benefit your organization? While a comprehensive list of all tasks is beyond the scope of this article, here are the five basic steps you can follow, with illustrations from one of our customers who recently migrated to Azure.

The starting point for implementing DDC is a set of bounded contexts that describes your business model. The steps to identify your bounded contexts are not covered here, but the process described in Domain-Driven Discovery is one approach.

Once you identify your bounded contexts, organize them into two groups: domain contexts, which represent the business domains your organization operates in, and technical contexts, which provide the shared technical capabilities that support your systems and data.

To illustrate, let's look at our customer, a medical supply company. Their domain and technical contexts are shown in Figure 3.


Your organization's domain contexts would be different, of course.

For technical contexts, the number will depend on factors including your organization's industry, complexity, and regulatory and security requirements. A Fortune 100 financial services firm will have more technical contexts than a new media start-up. With that said, as a starting point DDC recommends six technical contexts for supporting all your systems and data.

You don't have to create all of these up front; start with Cloud Management and build out as needed.

With your bounded contexts defined, it's now time to build a secure cloud foundation for supporting your organization's workloads today and in the future. In our experience, it is helpful to organize your cloud capabilities into three layers based on how they support your workloads. For our medical supply customer, Figure 4 shows their contexts aligned to the Application, Platform, and Foundation layers of their cloud architecture.


With DDC, you align AWS Organizational Units (OUs) or Azure Management Groups (MGs) to bounded contexts. By align, we mean you name them after your bounded contexts. These are the highest levels of management and through the use of inheritance they give you the ability to standardize controls and settings across your entire cloud architecture.

DDC gives you flexibility in how best to organize your Accounts and Subscription taxonomy, from coarse-grained to fine-grained, as seen in Figure 5.

DDC recommends starting with one OU/MG and at least two Accounts/Subscriptions per bounded context. If your organization has higher workload isolation requirements, DDC can support this too, as seen in Figure 5.
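The recommended starting taxonomy can be sketched as a simple mapping from bounded contexts to environments. A minimal sketch follows; the context names and the `Prod`/`NonProd` naming convention are illustrative assumptions, not prescribed by DDC or taken from the customer's actual model:

```python
# Sketch: one management group (or OU) per bounded context, each
# containing a Prod and a NonProd subscription (or account).
# All names below are hypothetical examples.

DOMAIN_CONTEXTS = ["Orders", "Distributors", "Payers"]
TECHNICAL_CONTEXTS = ["Cloud Management", "Network", "Security",
                      "Shared Services", "Compliance", "Data"]

def build_taxonomy(contexts, environments=("Prod", "NonProd")):
    """Return {management_group: [subscription, ...]} per context."""
    return {
        ctx: [f"{ctx}-{env}" for env in environments]
        for ctx in contexts
    }

taxonomy = build_taxonomy(DOMAIN_CONTEXTS + TECHNICAL_CONTEXTS)
```

For higher isolation requirements, the `environments` tuple could be extended (e.g., one subscription per workload environment) without changing the context-aligned structure.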


For our customer who had a small cloud team new to Azure, separate Subscriptions for Prod and NonProd for each context made sense as a starting point, as shown in Figure 6.


Figure 7 shows what this would look like in AWS.


For our customer, further environments like Dev, Test, and Stage could be created within their respective Prod and NonProd Subscriptions. This gives them isolation between environments, with the ability to configure environment-specific settings at the Subscription level or lower. They also decided to build just the Prod Subscriptions for the six technical contexts to keep things simple to start. Again, if your organization wanted to create separate Accounts or Subscriptions for every workload environment, this can be done too while still aligning with DDC.

From a governance perspective, in DDC we recommend domain contexts inherit security controls and configurations from technical contexts. Creating a strong security posture in your technical contexts enables all your workloads that run in domain contexts to inherit this security by default. Domain contexts can then override selected controls and settings on a case-by-case basis balancing team autonomy and flexibility with required security guardrails.
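The inherit-then-override pattern above can be sketched in a few lines. This is an assumed model for illustration only; the control names and values are hypothetical, and real enforcement would use services like Azure Policy or AWS service control policies:

```python
# Sketch: a domain context's effective controls are the baseline
# inherited from the technical contexts, with selected per-context
# overrides winning. All control names here are made up.

BASELINE_CONTROLS = {
    "encrypt_storage": True,        # inherited guardrail
    "public_ip_allowed": False,     # inherited guardrail
    "allowed_regions": ["eastus"],  # inherited default
}

def effective_controls(baseline, overrides):
    """Merge inherited baseline with context-specific overrides."""
    merged = dict(baseline)
    merged.update(overrides)  # overrides take precedence
    return merged

# A domain context overriding one setting, case by case:
orders = effective_controls(BASELINE_CONTROLS,
                            {"allowed_regions": ["eastus", "westus"]})
```

The design point is that teams only declare their deltas; everything they do not override stays governed by the baseline.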

Using DDC, your organization can grant teams the autonomy to innovate within guardrails. Leveraging key concepts from Team Topologies, stream-aligned teams can be self-sufficient within domain contexts when creating cloud infrastructure, deploying releases, and monitoring their workloads. Platform teams, primarily working in technical contexts, can focus on designing and running highly available services used by the stream-aligned teams. These teams work together to create the right balance between centralization and decentralization of cloud controls to meet your organization's security and risk requirements, as shown in Figure 8.


As this figure shows, policies and controls defined at higher level OUs/MGs are enforced downwards while costs and compliance are reported upwards. For our medical supply customer, this means their monthly Azure bill is automatically itemized by their bounded contexts with summarized cloud costs for Orders, Distributors and Payers to name a few.

This makes it easy for their CTO to share cloud costs with their business counterparts and establish realistic budgets that can be monitored over time. Just like costs, policy compliance across all contexts can be reported upwards with evidence stored in the Compliance technical context for auditing or forensic purposes. Services such as Azure Policy and AWS Audit Manager are helpful for continually maintaining compliance across your cloud environments by organizing your policies and controls in one place for management.
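The cost roll-up described above amounts to grouping billing line items by the bounded context (management group) they belong to. A minimal sketch, with entirely made-up figures and the assumption that each cost record is already tagged with its context:

```python
# Sketch: itemize a monthly cloud bill by bounded context so costs
# can be reported upwards. Record shape and amounts are hypothetical.
from collections import defaultdict

def costs_by_context(records):
    """Sum line-item costs per bounded context."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["context"]] += rec["cost"]
    return dict(totals)

bill = costs_by_context([
    {"context": "Orders", "cost": 1200.0},
    {"context": "Orders", "cost": 300.0},
    {"context": "Distributors", "cost": 450.0},
])
```

Because Accounts and Subscriptions roll up to context-aligned OUs/MGs, this grouping falls out of the cloud provider's native billing views rather than requiring custom tagging discipline.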

With a solid foundation and our bounded contexts identified, the next step is to align your workloads to the bounded contexts. Identifying all the workloads that will run in your cloud environment is often done during a cloud migration discovery, aided in part by a configuration management database (CMDB) that contains your organization's portfolio of applications.

When aligning workloads to bounded contexts, we prefer a workshop approach that promotes discussion and collaboration. In our experience this makes DDC understandable and relatable to the teams involved in the migration. Because teams must develop and support these workloads, the workshop also highlights where organizational structures may (or may not) align to bounded contexts. This workshop (or a follow-up one) can also identify which applications should be independently deployable and how the teams' ownership boundaries map to bounded contexts.

For our medical supply customer, this workshop revealed that a shared CI/CD tool in the Shared Services context needed permissions to deploy a new version of their Order Management system in the Orders context. This drove a discussion on how secrets and permissions would be managed across contexts, identifying new capabilities for secrets management that were prioritized during the cloud migration. By creating a reusable solution that worked for all future workloads in domain contexts, the cloud team built a new capability that improved the speed of future migrations.

Figure 9 summarizes how our customer aligned their workloads to bounded contexts, which are aligned to their Azure Management Groups.


Within the Orders context, our customer used Azure Resource Groups for independently deployable applications or services that contain Azure Resources, as shown in Figure 10.


This design served as a starting point for their initial migration of applications from a data center to Azure. Over the next few years, their goal was to re-factor these applications into multiple independent microservices. When that time came, they could do this incrementally, one application at a time, by creating additional Resource Groups for each service.

If our customer were using AWS, Figure 10 would look very similar but would use Organizational Units, Accounts, and CloudFormation Stacks for organizing independently deployable applications or services that contain resources. One difference between the cloud providers is that AWS allows nested stacks (stacks within stacks), whereas Azure Resource Groups cannot be nested.

For networking, in order for workloads running in domain contexts to access shared services in technical contexts, their networks must be connected or permissions must be explicitly granted to allow access. While the Network technical context contains centralized networking services, by default each Account or Subscription aligned to a domain context has its own private network containing subnets that are independently created, maintained, and used by the workloads running inside them.

Depending on the total number of Accounts or Subscriptions, this may be desired, or it may be too many separate networks to manage (each potentially has its own IP range). Alternatively, core networks can be defined in the Network context and shared with specific domain or technical contexts, thereby avoiding every context having its own private network. The details of cloud networking are beyond the scope of this article, but DDC enables multiple networking options while still aligning your cloud architecture to your business model. Bottom line: you don't have to sacrifice network security to adopt DDC.
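The two options can be expressed as a simple assignment rule: a context either receives its own private network or is attached to a core network shared from the Network context. A sketch under assumed names (the VNet naming and the choice of which contexts share are illustrative):

```python
# Sketch: decide each context's network. Contexts in the `shared`
# set attach to a core network defined in the Network context; all
# others get their own private network. Names are hypothetical.

def assign_networks(contexts, shared):
    """Map each context to a dedicated VNet or the shared core VNet."""
    return {
        ctx: "core-shared-vnet" if ctx in shared
        else f"{ctx.lower()}-vnet"
        for ctx in contexts
    }

networks = assign_networks(["Orders", "Payers"], shared={"Payers"})
```

Either outcome keeps the OU/MG taxonomy aligned to bounded contexts; only the connectivity model changes.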

With each workload's destination identified, it was time to begin moving them into the right Account or Subscription. While this was a new migration for our customer (greenfield), for your organization this may involve re-architecting your existing cloud platform (brownfield). Migrating a portfolio of workloads to AWS or Azure and the steps for architecting your cloud platform are beyond the scope of this article, but with respect to DDC this is a checklist of the key things to keep in mind:

For brownfield deployments of DDC that are starting with an existing cloud architecture, the basic recipe is:

Your cloud architecture is not a static artifact; the design will continue to evolve as your business changes and new technologies emerge. New bounded contexts will appear that require changes to your cloud platform. Ideally, much of this work is codified and automated, but in all likelihood you will still have some manual steps as your bounded contexts evolve.

Your Account/Subscription taxonomy may change over time too, starting with fewer to simplify initial management and growing as your teams and processes mature. The responsibility boundaries of teams, and how these align to bounded contexts, will also mature over time. Methods like GitOps work nicely alongside DDC to keep your cloud infrastructure flexible, extensible, and continually aligned with your business model.

DDC extends the principles of DDD beyond traditional software systems to create a unifying architecture spanning business domains, software systems, and cloud infrastructure (BizDevOps). DDC is based on the software architecture principle of high cohesion and low coupling used when designing complex distributed systems, like your AWS and Azure environments. Employing the transparency and shared-language benefits of DDD when creating your organization's cloud architecture results in a secure-yet-flexible platform that naturally evolves as your business changes over time.

Special thanks to John Chapin, Casey Lee, Brandon Linton and Nick Tune for feedback on early drafts of this article and Abby Franks for the images.


50 programs that fix Microsoft Windows problems fast | PCWorld – PCWorld

No matter how much experience you have with Microsoft's Windows, it can still be improved by turning to software and tools that make the operating system that much better.

Take Windows 11, for example: When Microsoft introduced it with extremely strict system requirements in autumn 2021, it was only a matter of time before those barriers could be circumvented.

In order to install the new operating system on older PCs, the registry first had to be changed manually. Later, this could be simplified with a batch file, and now even that is superfluous thanks to Rufus, a small tool for creating bootable USB sticks. Now, with just a few additional mouse clicks, you can run Windows 11 on almost any computer.
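For reference, that manual registry change typically took the form of a small .reg file creating the LabConfig bypass values, which are widely documented for the Windows 11 installer; verify them against your specific Windows build before relying on them:

```reg
Windows Registry Editor Version 5.00

; Hypothetical example of the documented installer bypass keys;
; apply during setup at your own risk.
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\LabConfig]
"BypassTPMCheck"=dword:00000001
"BypassSecureBootCheck"=dword:00000001
"BypassRAMCheck"=dword:00000001
```

Rufus now automates the equivalent of these tweaks for you, which is why the manual step is superfluous.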

Let's start this look at helpful, dead-simple Windows software with that very program, before diving deeper into several different categories.

If you'd prefer to wade into a deep, powerful program, check out our guide to Microsoft Sysinternals, the best Windows troubleshooting tool.

Download the installation file for Windows 11 from Microsoft via the option Download a Windows 11 disk image (ISO).

Now insert a flash drive with at least 8GB of storage space into the computer. Start Rufus, click on Select on the interface, select the Windows 11 ISO file, and then click Start. The Customize installation dialogue then appears, giving you an option to avoid Windows 11's obligation to set up an online account. Follow the setup process until Windows 11 is installed.

This Rufus-loaded flash drive will not only reinstall Windows 11 on any PC, but also upgrade any Windows 10 installation via the setup.exe file!

If you want to move an older Windows 10 system 1:1 to a new PC, we recommend Easeus Todo Backup. With it, you create an image of your old computers storage, from which you restore your system with all settings, programs, and data on your new PC. In addition, Windows 10 can then be upgraded to Windows 11 without any problems.

To install an older Windows version (for example, Windows 8.1, 10, or 11 version 21H2), save it to your storage with Windows ISO Downloader and create a setup flash drive from it with Rufus.

Depending on the Windows version, the hardware and the history of the PC, you may need a product key to activate the operating system for the new installation. You can read this key on your existing system with Showkeyplus.

Reset Windows Update Tool solves various update problems: Almost 20 features are available after starting the program with administrator rights.

If you have installed several versions of Windows, Linux, or other operating systems on your computer, you can use Easy BCD to adjust the boot entries and their prioritization.

Creating a flash drive as a multiboot system for booting different live systems was a complex matter for a long time. Ventoy fundamentally changes that. With this tool, all you have to do is make the flash drive bootable by clicking on Install and then simply save the ISO files on the stick within Windows. After booting from the flash drive, you select the desired live system via the Ventoy interface. The key benefit here is that you don't need to create a new bootable flash drive when a new system version appears; you simply replace the older ISO file with the new one. Another plus: you can continue to use free space on the USB stick to save and transport your data.

Glary Utilities makes both problem analysis and elimination possible with a mouse click; everything else is done automatically by the software in the background.


The promise of 1-click maintenance is hit or miss on the PC. The causes of possible errors are too varied and the solutions too complex. That said, Ccleaner and Glary Utilities are always worth a try. You can start the system analysis and the subsequent problem elimination with just one mouse click.

Bootracer requires a few more clicks. The program analyzes the start-up process and breaks it down into individual segments. This shows at a glance which process or autostart program has a problem. You can choose to start the boot analysis as a normal complete Windows start, or limited to the system without autostarting software. The wizards make it easy to use Bootracer, including the necessary restarts. You can see in the details where and why your start-up may take an unusually long time. These limitations help to get to the bottom of the cause or, if it makes sense, to exclude the software in question from autostarting when Windows loads.

The analysis tools Hwinfo and Speccy show whether something is wrong with your hardware. Both programs provide a wealth of information and sensor measurement data. Even more information on your processor is provided by CPU-Z and Core Temp, while GPU-Z digs deep into graphics card details. Unknown Device Identifier is helpful in identifying unknown components; the tool shows many more components than the native Windows device manager.

Memtest86 tests the main memory for errors, while Crystaldiskinfo analyzes SSDs and magnetic hard drives by reading out the SMART parameters. A look at the overall status shows whether everything is OK. The tool sounds an alarm in the event of abnormal values, which is very important for drives full of personal data.

Snappy Driver Installer recognises outdated hardware drivers and automatically updates them with the latest versions if desired.


Snappy Driver Installer checks whether your installed hardware drivers are up to date. The tool starts without installation. Click on the option Download indices only and wait until the system analysis is complete. Now, if desired, activate the Restore point field. To replace all obsolete drivers, continue by clicking on Select all and then Install at the top left. Alternatively, check the obsolete entries individually. Due to the file sizes of some drivers, downloading and installing may take some time.

Windows Explorer and the desktop are always used on a computer even if mostly unconsciously. While Microsoft has equipped the file manager with tabs in Windows 11, you have to retrofit the file explorer tabs into Windows 10.

To do this, install Qttabbar, restart the PC, open Explorer, and click on the down arrow in the View tab under the Options symbol on the right. There, you activate the list entry Qttabbar to show the new tab bar.

The free version of Tidy Tabs allows up to three tabs in a window, across programs for example, an Excel sheet, a Word document, and Outlook. It makes it easier to view pictures with a large preview that you start and close by pressing the space bar.

Treesize Free shows you at a glance which files are eating up space on your hard drive. The already mentioned Ccleaner also offers a quick search function for duplicate files (Extras - Duplicate Finder). Other tools like Anti-Twin offer more options, but are more complicated to use.

Small intervention, big effect. Capslock Goodbye prevents you once and for all from accidentally pressing the Caps Lock key again.


Clipboard Master can do more and is more convenient than the Windows clipboard. If you're annoyed by accidental presses of the caps lock key, Capslock Goodbye is the tool for you. The software deactivates the key or assigns an alternative function to it. Desktop OK restores the icon placement on the desktop if it has been mixed up.

Three other programs provide more order (and more space on the hard disk): Should I Remove it shows which pre-installed software you can safely delete on new computers. O&O Appbuster makes it easier to remove Windows apps that hardly anyone needs. Unchecky prevents the secret installation of unwanted programs and toolbars. Its a godsend.

Remote help, i.e. taking over a remote computer via the Internet, is not only efficient, but also very simple with just one click in TeamViewer Quicksupport. The person who needs help starts the tool and gives his or her displayed ID and password to the helper. The helper takes care of the rest with the fully comprehensive Teamviewer software, which is also free for private use.

Changes to hard drives and partitions have a profound effect on your system and are often not easy to reset. Smaller tools can cause much less damage: Superdiskformatter, for example, allows you to change the file system (FAT32, NTFS, etc.) except for the Windows partition. Fat32formatter formats almost arbitrarily large data media as a FAT32 system, and Drive Letter Changer is used to assign fixed drive letters to USB drives.

Raidrive assigns a drive letter to cloud storage devices such as Dropbox, Google Drive, and OneDrive for quick 1-click access in Explorer. To do this, select a storage service and a letter via Add, log in with your login data and allow Raidrive to access the cloud. Done!

Cloudevo combines several cloud storages under Windows into one drive. This is convenient and allows you to store even oversized files on the internet.


Cloudevo also simplifies the handling of online storage by combining diverse cloud storage pools into a single drive with theoretically unlimited capacity. This is also possible with several free accounts from the same provider. While you as a user only see your Cloudevo drive, the service behind it automatically distributes the stored data to the various cloud storage providers.

Drive pooling with local drives, for example with several USB drives, can be done with Liquesce. Data that is too large to be sent by email can be forwarded with O&O FileDirect. The software creates an access link to your PC, which the recipient can use to transfer the shared data. Your computer must be switched on and online during the transfer.

If Windows blocks access to certain files, thus preventing deletion, renaming, or copying, Lock Hunter releases them again. You can change other permissions of folders and files with Attribute Changer.

Have I been pwned? and HPI Identity Leak Checker are not programs to install; a simple mouse click is enough here too. Just type in your email address and you'll know whether your account has been affected by one of the countless account and password hacks. If so, be sure to change the corresponding password!

Defender UI provides a new, easier-to-use interface for all Microsoft security features than the integrated Windows Defender offers. The software contains four predefined security profiles and clearly groups together many security features and settings otherwise scattered throughout the operating system.

Virustotal Uploader simplifies the process of uploading potentially dangerous files to the Virustotal scanning platform. Instead of manually calling up the website, selecting the file, and uploading it, this free app works much faster via the Windows context menu. The browser plug-in I don't care about cookies eliminates the hassle of deselecting cookies on many websites by blocking or hiding the usual pop-up dialogues. It is best to combine the add-on with the automatic deletion of all cookies when you close your browser.

Wipe is suitable as a supplement for removing online traces. The software not only deletes browser data, but also temporary files and more.

You can securely delete data from your hard drive with Eraser: The name says it all.

Ungoogled Chromium is a special fork of the free Chromium browser, on which Google Chrome is also based. Unlike Chrome, Ungoogled Chromium does without any Google services for more privacy.

Simple Code Generator creates QR codes for private information that you wouldn't want to entrust to an online QR generator, such as mail addresses, Wi-Fi access information, Outlook or personal contacts, and the like.

Last but not least is USB-Logon. It lets you create a USB stick for fast and secure Windows logon without a password. USB-Logon is a good alternative for PCs without a Windows Hello-enabled webcam or fingerprint sensor.

The Microsoft Powertoys tool collection has grown to over 20 amazingly helpful features, many of them with (almost) one-click operation. For example, Always on top keeps any program window visible in the monitor at all times; Awake switches off the power settings for a certain time; FancyZones allows multiple windows to be easily arranged even under Windows 10; the File Explorer Add-Ons show the contents for various file formats as a large preview; and Image Resizing changes the size of photos simply via the context menu. Also via the context menu, PowerRename functions allow automatic renaming of files.

Because we had to wait so long for it, the new PowerToys feature Paste as plain text is downright ingeniously simple. The keyboard shortcut Ctrl-Windows-Alt-V inserts the content stored in the clipboard unformatted into any program, which is ideal, for example, for quickly transferring web content into word processing.

This article has been translated from German to English and originally appeared on pcwelt.de.


Why IT leaders should deploy generative AI infrastructure now – TechTarget

In the past several months, rampant excitement about the potential benefits of generative AI technology has increased the technology's priority status across enterprise organizations worldwide.

According to a recent research report from TechTarget's Enterprise Strategy Group, "Beyond the GenAI Hype: Real-world Investments, Use Cases, and Concerns," 42% of organizations said they are in a generative AI proof of concept if they haven't already deployed it in production. Our research showed that generative AI ranks higher than cloud in overall strategic business initiatives, which highlights how critical these projects are now.

In other words, the adoption rate for generative AI projects is expected to be massive and unlike anything we've seen from enterprise technology. And as a result, there is a high likelihood that your own executive team is currently in a conflicted state: They are excited about the potential productivity benefits of generative AI, but they are concerned about the risks to data privacy.

Regardless of the pace of AI adoption within your own organization, the expected overall adoption rate means that if your organization lags in adopting and deploying generative AI products, your competition will gain an increased advantage.

Organizations need to move quickly when it comes to generative AI, but they should do so in a manner that enables them to start small, scale quickly and mitigate risk associated with data privacy, compliance and security. With that necessity in mind, Nutanix has introduced GPT-in-a-Box.

The product combines the following elements:

There is a lot to like in this packaging, but most important is its simplicity of design. Nutanix is known for simplicity, which is a hallmark of its HCI technology.

Overall, there is likely going to be a longer than usual "crawl" phase before you get to "run" with generative AI within your organization. But if you want to get a leg up on generative AI initiatives, don't waste time trying to deploy the perfect infrastructure for what the ideal use will be in three to five years. In fact, few -- if any -- organizations truly have a strong grasp on what the ideal use will be.

We do have a sense of what those uses will look like in general. According to Enterprise Strategy Group's generative AI research report, the more commonly identified uses improve productivity, efficiency and the overall customer experience.

As a result, organizations should seek to speed up infrastructure deployment to enable their data science teams to get started on identifying the right data and models. As an example, the Nutanix product enables organizations to start quickly, and it gives them the flexibility to scale and adapt as needed.

The ability to deploy the product on premises is also important. While public cloud services will likely support most generative AI products, a separate Enterprise Strategy Group research study, "Multi-cloud Application Deployment and Decision Making," found that 29% of organizations identified AI/machine learning workloads as not being candidates for cloud deployment.

Some of those organizations will be launching AI initiatives that will use sensitive data or data sets with privacy concerns. Or maybe the data and compute requirements in the cloud are simply too costly for organizations just getting started. According to the multi-cloud research report, the cost of low-latency performance in the cloud is the most common reason organizations decided that an on-premises workload is not a candidate for the public cloud.

Ultimately, when it comes to generative AI, speed is of the essence. And given the increased executive-level priority on generative AI workloads, IT leaders must be proactive.

GPT-in-a-Box is simple to use and flexible, but Nutanix is not the only provider that has announced a strengthened Nvidia partnership and a product for generative AI. Always evaluate your options.

See the original post here:
Why IT leaders should deploy generative AI infrastructure now - TechTarget

Read More..

Bitcoin clean energy usage reportedly exceeds 50% Will Tesla start accepting BTC payments? – Cointelegraph

Elon Musk said in 2021 that Tesla would accept Bitcoin payments once miners were using roughly 50% clean energy sources with a positive future trend, a benchmark that may have recently been met.

In a Sept. 14 thread on X (formerly Twitter), Bloomberg analyst Jamie Coutts reported that the percentage of Bitcoin (BTC) mining energy coming from renewable sources had exceeded 50%, with falling emissions plus a dramatically rising hash rate. According to Coutts, the push toward renewable energy sources was the result of miners dispersing from China in the wake of the country's mining ban starting in 2021, as well as certain nations turning to mining to monetize stranded and excess energy.
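The 50% benchmark is simply a weighted share of mining energy by source. As a minimal sketch, the figures and source names below are hypothetical illustrations (not Coutts's data), and which sources count as "clean" is itself an assumption:

```python
# Hedged sketch: estimate the clean-energy share of a mining energy mix.
# All shares below are made-up illustrations, not reported data.
def renewable_share(mix):
    """mix maps energy source -> (share of mining energy, counts_as_clean)."""
    total = sum(share for share, _ in mix.values())
    clean = sum(share for share, is_clean in mix.values() if is_clean)
    return clean / total

example_mix = {
    "hydro":       (0.23, True),
    "wind":        (0.14, True),
    "solar":       (0.08, True),
    "nuclear":     (0.08, True),   # treated as clean in this sketch
    "natural gas": (0.25, False),
    "coal":        (0.22, False),
}

print(f"Clean-energy share: {renewable_share(example_mix):.0%}")  # Clean-energy share: 53%
```

Under this toy mix, the clean share just crosses the 50% threshold Musk named; real estimates depend heavily on how hash rate is attributed to grids.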

Countries investing in BTC mining include El Salvador (which has also recognized the cryptocurrency as legal tender since 2021), Bhutan, Oman and the United Arab Emirates. Meeting the 50% energy benchmark could mean a greater move toward adoption by one of the biggest companies in the world.

Related: Tesla's diamond hands: EV maker's Bitcoin holdings see no change in Q2

Musk, the CEO of Tesla, owner of X and founder of SpaceX, announced that Tesla would stop accepting BTC payments in May 2021, citing the rapidly increasing use of fossil fuels for Bitcoin mining and transactions at the time. Since establishing a sustainable energy source threshold of 50% for when the firm would resume payments, Musk has acknowledged a positive trend toward green energy sources but hasn't changed Tesla's policy.

The Tesla CEO does not appear to have publicly announced any move to resume BTC payments. At the time of publication, the price of Bitcoin was $26,572, having risen more than 2% in the last seven days.

Magazine: Bitcoin is on a collision course with Net Zero promises

Continue reading here:
Bitcoin clean energy usage reportedly exceeds 50% Will Tesla start accepting BTC payments? - Cointelegraph

Read More..

Weekly Market Wrap: Deutsche Banks crypto move propels Bitcoin to US$26,750 – Yahoo Finance

Bitcoin rose 3.08% from Sept. 8 to Sept. 15, to US$26,625 as of 6:45 p.m. Friday in Hong Kong. The world's largest cryptocurrency by market capitalization has been trading below US$30,000 since Aug. 9, according to CoinMarketCap data. Ether, the world's second-largest cryptocurrency, rose 0.21% over the week to US$1,628.


German banking giant Deutsche Bank partnered with Swiss crypto firm Taurus to offer Bitcoin and crypto custody solutions to institutional clients, the Swiss firm announced on Thursday. This means that for the first time, the US$1.3 trillion asset manager will be able to hold a limited amount of crypto on behalf of clients and offer tokenized versions of traditional financial assets.

Bitcoin rose to a weekly high of US$26,750 on Friday, bolstered by the announcement from Germany's largest lender, according to Phillip Lord, president of the crypto payment app Oobit.

"The trend towards more product launches and more geographical diversity in relation to cryptocurrencies is a fact, it is happening, whether in the Lion City, El Salvador, Germany, or the U.S."

"Markets always do what they are poised to do, but never when. Hence, while we are optimistic about seeing the US$30,000 barrier soon, we wouldn't make a clear projection that this would happen in the second half of September," added Lord.

Last Friday, the U.S. Securities and Exchange Commission (SEC) appealed July's summary judgment, which said Ripple's XRP sales to institutional investors violated securities laws, but sales on public exchanges to retail investors did not.


"It seems that the SEC is quite unhappy with the summary judgment and is trying to exhaust all means to get a ruling in its favor," Jonas Betz, crypto market analyst and founder of consultancy firm Betz Crypto, told Forkast.

"It is a common legal procedure to try to challenge decisions, but in my opinion, it will come to nothing in this case. The XRP token may see higher volatility in the coming weeks, but a broad decline in investor sentiment is unlikely."


While the SEC's appeal didn't come as a surprise, investor confidence took a hit, with Bitcoin falling to a weekly low of US$25,060 on Monday, three days after the agency's legal action.

Tuesday brought positive developments for investors, after Standard Chartered's crypto custody arm, Zodia Custody, launched services in Singapore for financial institutions.

Standard Chartered's move indicates growing institutional acceptance of crypto, according to Manuel Ferrari, the co-founder of Money On Chain, the first Bitcoin-backed stablecoin protocol on Rootstock.

"This move could potentially signal the start of a growing trend for more large institutions to enter the market. As one of the world's leading financial institutions, Standard Chartered's entry into the crypto space lends credibility and legitimacy to digital assets," wrote Ferrari in a statement shared with Forkast.


The same day, Franklin Templeton, a holding company with US$1.52 trillion in assets under management, filed a spot Bitcoin exchange-traded fund (ETF) application. This brought only temporary relief for investors, considering that the SEC has delayed its decision on several such ETF applications, including the ones from BlackRock and WisdomTree.

Two days after the news, Bitcoin recovered to US$26,529 on Thursday, which could pave the way to more upside momentum in September, according to Kadan Stadelmann, chief technical officer of blockchain infrastructure development firm Komodo.

"There is a growing sentiment that Bitcoin could rally back above US$30,000 in the coming month," wrote Stadelmann.

However, Ferrari expects Bitcoin's recovery to be short-lived.

"The recent rebound in Bitcoin's price has set the stage for a temporary bounce in price, likely to the US$28,000 level. That will likely be short-lived, however, as Bitcoin is likely to experience further downward pressure in the coming months," wrote Ferrari.

On the macroeconomic front, the release of the U.S. consumer price index (CPI) showed that inflation posted its biggest monthly increase this year, rising 0.6% for August and 3.7% from a year ago.


Bitcoin Cash was this week's biggest gainer in the top 100, rising 13.23% to US$217.14. The token started picking up pace on Tuesday as the wider crypto investor sentiment improved following the launch of Standard Chartered's crypto custody wing.

Rune, the native governance token of the ThorChain network, was this week's second-biggest gainer, rising 11.53% to US$1.75. The coin started picking up momentum on Wednesday and has been receiving increased investor interest since lending went live on the protocol on Aug. 21.

See related article: Grayscale wins against SEC as India moves on blockchain; Friend.tech loses friends

Bitcoin's double-bottom technical formation, which printed its first leg down on June 15 and the second one this week, is a bullish sign for the short term, according to Lucas Kiely, the chief investment officer of digital asset platform Yield App.

"While Bitcoin is likely to trade lower in the coming months, the double bottom signals a short-term bullish trend for Bitcoin. If Bitcoin manages to close the week above the resistance of approximately US$25,000, it signals strong short-term support," wrote Kiely, adding that Bitcoin could see considerable bullish momentum if it returned to US$30,000 in September.
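A double bottom is simply two local price lows at roughly the same level with a higher peak between them. As an illustrative sketch only (the function name, tolerance, and price series are all hypothetical; real pattern scanners use swing detection and volume), the idea can be coded as:

```python
def looks_like_double_bottom(prices, tolerance=0.02):
    """Crude check: two similar local minima with a meaningfully higher
    peak between them. Illustrative only, not a trading signal."""
    # Indices of simple local minima (strictly below both neighbors)
    lows = [i for i in range(1, len(prices) - 1)
            if prices[i] < prices[i - 1] and prices[i] < prices[i + 1]]
    if len(lows) < 2:
        return False
    a, b = lows[0], lows[-1]
    # The two bottoms should sit within `tolerance` of each other...
    similar = abs(prices[a] - prices[b]) / prices[a] <= tolerance
    # ...with a peak between them that clears the first bottom by more than tolerance
    peak_between = max(prices[a:b + 1]) > prices[a] * (1 + tolerance)
    return similar and peak_between

# Hypothetical weekly closes echoing the two legs described above
series = [30000, 27000, 25100, 26500, 28000, 26800, 25060, 26600]
print(looks_like_double_bottom(series))  # True
```

The June 15 and mid-September lows cited by Kiely would be the two "legs" such a check looks for.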

In the macroeconomy, investors will be looking forward to the Federal Reserve's next interest rate decision on Wednesday. The CME FedWatch Tool predicts a 97% chance that the central bank will leave the current rate unchanged in September, up from 92% one week ago. It gives a 67.2% chance of another pause in November.

See related article: India's G20 Presidency, Blockchain Week & Singapore's new President

Follow this link:
Weekly Market Wrap: Deutsche Banks crypto move propels Bitcoin to US$26,750 - Yahoo Finance

Read More..

FTX Gets Court Approval to Sell Billions in Bitcoin, Ethereum and Solana – Decrypt

Collapsed digital asset exchange FTX was today given the green light to sell billions in crypto assets by the judge overseeing its bankruptcy proceedings.

Judge John Dorsey of the U.S. Bankruptcy Court for the District of Delaware on Wednesday approved the defunct crypto brand's plan to sell $3.4 billion in Solana, Ethereum, Bitcoin, and other assets.

The company's plan for offloading the assets, first outlined in August, will appoint Mike Novogratz's Galaxy Digital as the investment manager overseeing the sale. According to the plan, FTX will cap its selling at $100 million worth of tokens per week, a limit that could be increased to $200 million on an individual token basis.

Judge Dorsey will allow FTX to raise its weekly maximum if the company gets written authorization from the court. But a footnote on the order clarifies that sales of Bitcoin, Ethereum, stablecoins, and the redemption of stablecoins will not count toward the $100 million weekly limit. Calculation of the limit will also exclude transactions made to bridge tokens from non-native blockchains back to their native networks.
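The cap arithmetic described above can be sketched in a few lines. This is a simplified illustration, not the order's actual logic: token names and dollar amounts are hypothetical, the bridging exclusion is omitted, and which stablecoins are covered is an assumption:

```python
# Hedged sketch of the weekly sale-cap rule as described above.
WEEKLY_CAP_USD = 100_000_000
# Per the order's footnote, BTC, ETH and stablecoin sales don't count
# toward the cap (stablecoin tickers here are assumed examples).
EXCLUDED = {"BTC", "ETH", "USDC", "USDT"}

def capped_sales_total(sales):
    """sales: list of (token, usd_amount) tuples for one week.
    Returns the USD total that counts toward the weekly cap."""
    return sum(usd for token, usd in sales if token not in EXCLUDED)

# A hypothetical week of sales: only SOL and APT count toward the cap.
week = [("SOL", 60_000_000), ("BTC", 40_000_000), ("APT", 35_000_000)]
counted = capped_sales_total(week)
print(counted, counted <= WEEKLY_CAP_USD)  # 95000000 True
```

In this toy week, $135 million of gross sales still complies, because the $40 million of Bitcoin is excluded from the count.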

FTX quickly and unexpectedly went bankrupt last November due to alleged criminal mismanagement.

Billions of dollars in customer cash disappeared, and the exchange's new management is now working to pay back creditors. Selling these assets will help plug the hole, which originally stood at $7 billion.

A Monday court filing showed that FTX owns $1.16 billion in Solana (SOL), $560 million in Bitcoin (BTC), $192 million in Ethereum (ETH), and $137 million in Aptos (APT). The crypto prices in the court document are based on pricing from August 31.

Some $800 million in cash and public equity has already been recovered.

Ex-CEO and co-founder of FTX Sam Bankman-Fried is awaiting a massive criminal trial in October after his crypto behemoth went bust last year.

Feds arrested the fresh-faced ex-Jane Street trader and MIT graduate and hit him with 13 criminal charges, including wire fraud, securities fraud, conspiracy to commit bank fraud, and defrauding the Federal Election Commission.

Originally posted here:
FTX Gets Court Approval to Sell Billions in Bitcoin, Ethereum and Solana - Decrypt

Read More..

Binance CEO Issues Frank Warning As Fears Swirl Of An Imminent Bitcoin, Ethereum And Crypto Price Crash – Forbes

Bitcoin, ethereum and crypto are teetering on the brink of disaster, with market watchers warning of a looming price crash.

Subscribe now to Forbes' CryptoAsset & Blockchain Advisor and successfully navigate the bitcoin and crypto market rollercoaster

The bitcoin price, which has lost momentum after rocketing higher through the first half of this year, has printed an ominous "death cross" pattern along with the ethereum price.

Now, after the chief executive of Coinbase revealed an "important" bitcoin update this week, Binance CEO Changpeng "CZ" Zhao has issued a "frank" warning over disappearing "fiat ramps" that could weigh on the entire bitcoin, ethereum and crypto market.

It's at the start of a bull run that you need up-to-date information the most! Sign up now for the free CryptoCodex, a daily newsletter for traders, investors and the crypto-curious that will keep you ahead of the market

Appearing at a Singapore crypto conference, CZ was asked what the biggest challenges would be in bringing the next 100 million users into the bitcoin, ethereum and crypto market.

"Today, to be very frank, it's actually fiat ramps," CZ said in comments reported by Insider, referring to how people move money from traditional banks to crypto exchanges. "With tightening regulations in the earlier part of this year, we're seeing a lot of traditional institutions that used to provide fiat ramp channels pull away."

A U.S. banking crisis earlier this year that forced the closing of crypto-friendly Silvergate, Signature and Silicon Valley banks has pushed many exchanges and crypto companies offshore in search of banking partners.

Despite Wall Street giants like BlackRock and Fidelity expanding their bitcoin and crypto services, many banks are increasingly unwilling to do business with crypto companies, whose reputations have been tarnished by the bitcoin and crypto market price crash and the implosion of major exchange FTX.

The traditional financial service sector pull-back from the crypto market has been branded "Operation Choke Point 2.0" by some in the crypto industry who fear it's been directed by the U.S. government and regulators. The original 2013 Operation Choke Point was a U.S. Department of Justice initiative to discourage banks from working with firearm dealers, payday lenders, and other companies believed to be at a high risk for fraud and money laundering.

Sign up now for CryptoCodex, a free daily newsletter for the crypto-curious

Meanwhile, the Securities and Exchange Commission (SEC) has been pursuing a campaign of heavy-handed enforcement action against crypto companies, including Binance, the world's largest crypto exchange by volume.

In June, the SEC sued Binance, its U.S. arm and rival U.S. platform Coinbase, alleging they had violated securities rules.

Binance.US's chief executive Brian Shroder abruptly departed the company this week, quitting at the same time as the exchange axed one-third of its staff. Just days later, Krishna Juvvadi, head of legal, and Sidney Majalya, chief risk officer, left the company, the Wall Street Journal reported, citing anonymous sources.

Last week, a top Federal Reserve official, Michael Barr, warned he's "deeply concerned" about the $120 billion stablecoin market that's exploded over the last few years, which is closely linked to the price of bitcoin, ethereum and other major cryptocurrencies.

I am a journalist with significant experience covering technology, finance, economics, and business around the world. As the founding editor of Verdict.co.uk I reported on how technology is changing business, political trends, and the latest culture and lifestyle. I have covered the rise of bitcoin and cryptocurrency since 2012 and have charted its emergence as a niche technology into the greatest threat to the established financial system the world has ever seen and the most important new technology since the internet itself. I have worked and written for CityAM, the Financial Times, and the New Statesman, amongst others. Follow me on Twitter @billybambrough or email me on billyATbillybambrough.com. Disclosure: I occasionally hold some small amount of bitcoin and other cryptocurrencies.

Visit link:
Binance CEO Issues Frank Warning As Fears Swirl Of An Imminent Bitcoin, Ethereum And Crypto Price Crash - Forbes

Read More..

Bitcoin Critique: ‘Black Swan’ Author Courts Controversy With Provocative Take – Bitcoinist

Black Swan author Nassim Nicholas Taleb unleashed a barrage of criticism directed at Bitcoin, particularly targeting its commonly touted advantage: a finite supply of 21 million coins.

Taleb's comments have created a stir within the cryptocurrency community and prompted a closer examination of Bitcoin's intrinsic value.

On the social media platform X, Taleb minced no words, lambasting what he termed "bitdiots": individuals who believe that the mere scarcity of an asset automatically makes it a sound investment.

According to Taleb, the fundamental confusion lies in equating "necessary" with "sufficient." In his view, there are countless items with restricted supplies that hold little to no value in the market. He humorously pointed out examples such as pebbles from Skorpios, underwear worn by Churchill and books owned by Cary Grant to illustrate his point.

Taleb's perspective is a departure from his earlier stance as a Bitcoin supporter. He was initially intrigued by Bitcoin during the global financial crisis and the WhatsApp Revolution in his home country, Lebanon. However, over time, Taleb's enthusiasm waned, leading him to view Bitcoin as neither a safe haven nor a viable asset.

Bitcoin's limited supply and digital scarcity have led many to consider it a potential store of value, similar to gold. Some investors and institutions view it as a "digital gold" that can preserve wealth over time.

A store of value is an asset that can retain its purchasing power over extended periods. Bitcoin's limited supply and decentralized nature appeal to those who seek an alternative to traditional stores of value, especially in times of economic uncertainty.

This isn't the first time Taleb has criticized the cryptocurrency market. Earlier this week, he decried attempts to artificially bolster market prices, stating, "You may artificially prop up the price; you may paint the tape by coordinated manipulation. But in the end, the market is a market, an idiot is an idiot, & youth, inexperience, & ignorance are not virtues."

Taleb has consistently referred to Bitcoin as a "magnet for idiots" and likened the cryptocurrency market to a "tumor." He prophesied that it would either "kill the host or self-destroy." These searing criticisms underscore his belief that Bitcoin's allure is driven more by speculation and hype than any inherent value.

As the cryptocurrency community grapples with Taleb's unorthodox perspective, it's clear that the debate surrounding Bitcoin's value proposition continues to evolve.

The author's critique serves as a stark reminder that the cryptocurrency landscape is far from settled, with passionate proponents and critics offering contrasting viewpoints on its future trajectory.

Featured image from Norvan Reports

More:
Bitcoin Critique: 'Black Swan' Author Courts Controversy With Provocative Take - Bitcoinist

Read More..