Category Archives: Data Science
Analytics and Data Science News for the Week of August 26; Updates from Incorta, SAS Software, Sisu, and More – Solutions Review
The editors at Solutions Review have curated this list of the most noteworthy analytics and data science news items for the week of August 26, 2022.
Keeping tabs on all the most relevant analytics and data science news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last month in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy analytics and data science news items.
SAS Viya with SingleStore enables the use of SAS analytics and AI technology on data stored in SingleStore's cloud-native real-time database. The integration provides flexible, open access to curated data to help accelerate value for cloud, hybrid and on-premises deployments. Through SingleStore's data compression and SAS analytic performance, the companies aim to reduce the complexity of data management and integration, as well as the computational time required to train sophisticated models.
The newly released data applications further extend the reach and agility of Incorta, and the new Incorta Marketplace opens the door to customizations and packaged solutions from partners and the community. General availability of the Incorta component SDK (software development kit), along with an expanding list of data connectors and data destinations, bolsters the platform's openness and extensibility.
Sisu automatically diagnoses key drivers of metric change, automates trend and anomaly detection, predicts changes before they occur, and connects to third-party systems to help organizations make the right decisions and drive better business outcomes. Sisu customers can now predict the future impact of metric change to confidently plan actions based on statistically relevant results and make better decisions.
For consideration in future analytics and data science news roundups, send your announcements to the editor: tking@solutionsreview.com.
Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.
Quick Study: How to Find the Right Data – InformationWeek
There's a hoarder mentality lingering in the data collection and management business. No, they aren't clinging to 30-year-old magazines or a roomful of plastic shopping bags. It's a strategy under which some enterprises insist on collecting every byte of data possible, "just in case we need it someday."
The hoarder mentality isn't as prevalent as it was in the early days of big data, but some data professionals and marketers just can't let go of their virtual shopping bags.
There's a delicate balancing act that enterprises face in this era of advanced analytics and AI. Not enough data can leave a company eating competitors' dust. Too much data can be hard to manage and can even make the company legally liable under myriad data privacy regulations. Good decisions, trustworthy AI apps, and efficient data management call for enterprises to collect, collate, and use the right data for them.
In this Quick Study you'll find some of InformationWeek's articles on data quality, data management, and how the right data can drive company success.
Data Quality: How to Show the ROI for Projects
Data quality is critical to enterprise success, but it also can be hard to quantify. Here are some key steps that you can take to measure and communicate the tangible return on investment for your data quality initiatives.
AI and Machine Learning Need Quality Assurance
Artificial intelligence and machine learning are not "set and forget" technologies. They need quality assurance to operate, and continue to operate, as intended.
How to Elevate Your Organization's Use of Data Analytics
Here are three best practices for leveling up your organization's use of analytics and attaining ROI with an enterprise analytics program. Think tools and company culture.
Seeking an Oasis in a Data Desert
Gaps in data quality, particularly due to supply chain issues during the pandemic, are becoming a serious influence on planning effective machine learning models.
The Cost of AI Bias: Lower Revenue, Lost Customers
A survey shows tech leadership's growing concern about AI bias and AI ethics, as negative events lead to lost revenue, lost customers, and more.
3 Ways Data Problems Can Stop Your Business
In today's data-rich world, it's important to focus not only on how data can be used to benefit the business, but also on where a flawed data strategy raises hurdles for the organization.
9 Ways to Reduce the Environmental Impact of Data
IT leaders can reduce the environmental impact of their data by considering a set of data sustainability principles, according to a Gartner analyst.
Priorities of Highly Successful Chief Data Officers
Is your data organization focused on the right areas? A survey of chief data officers looks at how these data executives can enable success in their organizations by the projects they choose to prioritize.
Chief Data Officers Help Steer Digital Transformations
Chief Data Officers are prioritizing data quality, ROI from data and analytics investments, and data sharing.
From AI to Teamwork: 7 Key Skills for Data Scientists
Today's data scientists need more than proficiency in AI and Python. Organizations are looking for specialists who also feel at home in the C-suite.
Why to Create a More Data-Conscious Company Culture
The drive to greater transparency in data requires efforts beyond breaking down data silos. Here's how and why to focus on cultivating a more data-literate workforce.
Data Science: A Guide to Careers and Team Building
Here's a collection of curated articles to help IT professionals learn how to make a career out of data science or how to build a team of data scientists.
CIOs Take Center Stage on ESG Strategies, Battling an Overflow of Data
With environmental, social and governance strategies forming a core part of organizational business plans, CIOs need to tap into various areas of expertise to ensure ESG efforts are organized and integrated enterprisewide.
Creating a Data Literate Culture in Your Organization
Everyone in the organization needs to understand how to access data, keep it secure and think critically about its potential use cases and applications.
An Insider's Look at Intuit's AI and Data Science Operation
Intuit's Director of Data Science speaks with InformationWeek about how the company's data operations have grown and evolved from just a few data scientists trying to sell executives on the value of data projects to becoming an AI-driven platform company.
Deployment Risk: AutoML & Machine Learning Without Expertise
Learn exactly what AutoML is, the value data scientists bring, and best practices on how to use AutoML to kickstart projects within your business.
IBM Databand.ai Acquisition Targets Data Observability
IBM says the deal to acquire the data observability startup will further Big Blue's own mission to provide observability for business.
GDPR Anniversary: Farewell to Global Data Lakes
Here's where we're at with the regulation and the data challenges organizations face today. While individuals are concerned about privacy, organizations struggle to balance data privacy with the need to leverage AI, machine learning and analytics to compete.
Data Fabrics: Six Top Use Cases
A data fabric management architecture optimizes access to distributed data while intelligently curating and orchestrating it for self-service delivery. Here's a look at some ways a data fabric may be able to help your organization.
AI Set to Disrupt Traditional Data Management Practices
The growth of advanced analytics such as machine learning and artificial intelligence is set to drive a disruption in traditional data management operations, according to Gartner.
Beyond the Data Warehouse: What Are the Repository Options Today?
With the rise of unstructured big data, a new wave of data repositories has come into use that don't always involve a data warehouse.
Centre asks Offshore IITs to Offer Courses in Data Science and AI – Analytics India Magazine
The ministry of education has proposed that offshore IIT campuses offer undergraduate degree programmes in areas such as data science and artificial intelligence. The proposal, now under consideration by the Centre, was based on feedback from Indian embassies abroad.
A survey report shows most of the universities in the target nations have undergraduate programmes in conventional disciplines. "From the feedback shared by the ambassadors of the identified host nations, the most frequently mentioned disciplines are related to computer science or IT, data sciences, AI, machine learning or robotics, electrical, electronics, mining, metallurgy, petroleum and energy," the report read.
Further recommendations were made on the various modes of admission, including JEE, GATE, SAT, GRE, and JAM. "A JEE or JEE (Advanced) exclusively for offshore campuses can be conceived in the future if it's economically and logistically viable," said the report.
The committee noted that academic programmes such as Bachelor of Technology (BTech) and Master of Technology (MTech) would be named Bachelor of Science (BS) and Master of Science (MS), the titles commonly used for international degrees.
The 17-member committee is led by IIT Council Standing Committee chairperson Dr K Radhakrishnan and also includes the directors of IIT Bombay, IIT Kharagpur, IIT Madras, IIT Delhi, IIT Kanpur, IIT Guwahati and IIT Dhanbad.
Flexible joint faculty contracts were proposed, with provisions for deputing faculty members from the existing IITs to the proposed institutes abroad in their formative years. The committee, set up by the Centre for the global expansion of IITs, consulted Indian missions abroad and identified the UK, UAE, Egypt, Saudi Arabia, Qatar, Malaysia and Thailand as prospective locations for offshore campuses under the IIT brand name.
Where DataOps and Opportunities Converge – DevOps.com
In today's data age, getting data analytics right is more essential than ever. A robust data analytics implementation enables businesses to hit key performance metrics, build data- and AI-driven customer experiences (think "personalize my feed") and capture operational issues before they spiral out of control. The list of competitive advantages goes on, but the bottom line is that many organizations successfully compete based on how effectively their data-driven insights inform their decision-making.
Unfortunately, implementing an effective data analytics platform is challenging due to orchestration (DAG alert!), modeling (more DAGs!), cost control (who left this instance running all weekend?!) and fast-moving data landscapes (data mesh, data fabric, data lakehouses and more). Enter DataOps. Recognizing modern data challenges, organizations are adopting DataOps to help them handle enterprise-level datasets, improve data quality, build more trust in their data and exercise greater control over their data storage processes.
DataOps is an integrated and agile process-oriented methodology that helps businesses develop and deliver effective analytics deployments. It aims to improve the management of data throughout the organization.
While there are multiple definitions of DataOps, below are common attributes that encompass the concept while going beyond data engineering. Here's how we define it:
We broadly define DataOps as a culmination of processes (e.g., data ingestion), practices (e.g., automation of data processes), frameworks (e.g., enabling technologies like AI) and technologies (e.g., a data pipeline tool) that help organizations to plan, build and manage distributed and complex data architectures. DataOps includes management, communication, integration and development of data analytics solutions, such as dashboards, reports, machine learning models and self-service analytics.
DataOps is attractive because it eliminates the silos between data, software development and DevOps teams. The very promise of DataOps encourages line-of-business stakeholders to coordinate with data analysts, data scientists and data engineers. Via traditional agile and DevOps methodologies, DataOps ensures that data management aligns with business goals. Consider an organization endeavoring to increase the conversion rate of their sales leads. In this example, DataOps can make a difference by creating an infrastructure that provides real-time insights to the marketing team, which can help the team to convert more leads. Additionally, an Agile methodology can be employed for data governance, where you can use iterative development to develop a data warehouse. Lastly, it can help data science teams use continuous integration and continuous delivery (CI/CD) to build environments for the analysis and deployment of models.
The amount of data created today is mind-boggling and will only increase. It is reported that 79 zettabytes of data were generated in 2021 and that number is estimated to reach 180 zettabytes by 2025. In addition to the increasing volume of data, organizations today need to be able to process it in a wide range of formats (e.g., graphs, tables, images) and with varying frequencies. For example, some reports might be required daily, while others are needed weekly, monthly or on demand. DataOps can handle these different types of data and tackle varying big data challenges. Add in the internet of things (IoT), such as wearable health monitors, connected appliances and smart home security systems, and that introduces another variable for organizations that also have to tackle the complexities of heterogeneous data as well.
OK, so, how can we make this a reality? First, to manage the incoming data from different sources, DataOps can use data analytics pipelines to consolidate data into a data warehouse or any other storage medium and perform complex data transformations to provide analytics via graphs and charts.
Second, DataOps can use statistical process control (SPC), a lean manufacturing method, to improve data quality. This includes testing data coming from data pipelines and verifying that it is valid, complete, and within defined statistical limits. This enforces continuous testing of data from sources to users, running tests that monitor inputs and outputs and ensure business logic remains consistent. If something goes wrong, SPC notifies data teams with automated alerts. This saves them time, as they don't have to manually check data throughout the data life cycle.
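The core of the SPC idea can be sketched in a few lines of Python: derive control limits from a baseline sample of some pipeline metric, then alert when a new batch falls outside them. The metric (daily row counts) and the three-sigma threshold here are illustrative assumptions, not a prescription from any particular DataOps tool.

```python
import statistics

def spc_limits(baseline, sigmas=3):
    """Compute lower/upper control limits (mean +/- N sigma) from a baseline sample."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - sigmas * sd, mean + sigmas * sd

def check_batch(values, limits):
    """Return the values that fall outside the control limits."""
    lo, hi = limits
    return [v for v in values if not (lo <= v <= hi)]

# Baseline: row counts observed over several healthy pipeline runs.
baseline = [1020, 980, 1005, 995, 1010, 990, 1000, 1015]
limits = spc_limits(baseline)

# Today's batch: one run's count has collapsed and should trigger an alert.
todays = [1002, 998, 120]
outliers = check_batch(todays, limits)
if outliers:
    # In a real system this would page the data team instead of printing.
    print(f"SPC alert: {outliers} outside control limits {limits}")
```

In practice the alerting hook would feed whatever notification channel the data team already uses; the statistical check itself stays this simple.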
Around 18% of a data engineer's time is spent on troubleshooting. DataOps enables automation to help data professionals save time and focus on more valuable high-priority tasks.
Consider one of the most common tasks in the data management life cycle: data cleaning. Some data professionals have to manually modify and remove data that is incomplete, duplicate, incorrect or flawed in any number of ways. This process is repetitive and doesn't require any critical thinking. You can automate it by writing customized scripts or installing a built-in data cleaning software tool.
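As a minimal sketch of such a customized script (the field names and rules are hypothetical, and a real pipeline would typically lean on a library like pandas), a cleaning pass might normalise whitespace, drop incomplete rows and discard exact duplicates:

```python
def clean_records(records, required=("id", "email")):
    """Drop incomplete rows and exact duplicates; normalise string whitespace."""
    seen = set()
    cleaned = []
    for rec in records:
        # Normalise: strip stray whitespace from string fields.
        rec = {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}
        # Drop rows missing any required field.
        if any(not rec.get(field) for field in required):
            continue
        # Drop exact duplicates (first occurrence wins).
        key = tuple(sorted(rec.items()))
        if key in seen:
            continue
        seen.add(key)
        cleaned.append(rec)
    return cleaned

raw = [
    {"id": "1", "email": " a@example.com "},
    {"id": "1", "email": "a@example.com"},   # duplicate once whitespace is stripped
    {"id": "2", "email": ""},                # incomplete: missing email
    {"id": "3", "email": "c@example.com"},
]
print(clean_records(raw))  # only the two valid, unique rows survive
```

Scheduling a script like this to run on every ingest is exactly the kind of repetitive work DataOps automation is meant to take off a data professional's plate.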
Many additional processes throughout the data life cycle can be automated via DataOps as well.
To develop your own DataOps architecture, you need a reliable set of tools that can help you improve your data flows, especially when it comes to crucial aspects of DataOps like data ingestion, data pipelines, data integration and the use of AI in analytics. There are a number of companies that provide a DataOps platform for real-time data integration and streaming that ensures the continuous flow of data with intelligent data pipelines spanning public and private clouds. Looking to increase the likelihood of success of data and analytics initiatives? Take a closer look at DataOps and harness the power of your data.
NASA MUREP PBI/HBCU Data Science Equity, Access and Priority for Research and Education (DEAP) – Space Ref
Engagement Opportunities in NASA STEM 2022 (EONS2022)
Appendix N: MUREP PBI/HBCU Data Science Equity, Access and Priority for Research and Education (DEAP)
NOTICE OF FUNDING OPPORTUNITY (NOFO) RELEASED August 22, 2022
This National Aeronautics and Space Administration (NASA) Notice of Funding Opportunity (NOFO), entitled "Engagement Opportunities in NASA STEM (EONS) 2022," solicits proposals for competitive funding opportunities in support of the Minority University Research and Education Project (MUREP), administered by NASA's Office of STEM Engagement (OSTEM). EONS-2022 is an omnibus announcement that includes a wide range of NASA STEM Engagement opportunities for basic and applied science and technology research and education. Specific opportunities will be issued periodically throughout the year as Appendices to this solicitation with individual requirements and milestones.
The following Appendix to EONS-2022 has been released: Appendix N: MUREP PBI/HBCU Data Science, Equity, Access and Priority for Research and Education (MUREP DEAP)
Full Proposals due at 5:00pm Eastern Time on Monday, October 24, 2022
NASA OSTEM's MUREP program solicits proposals from Predominantly Black Institutions (PBIs) and Historically Black Colleges and Universities (HBCUs) to establish Data Science Institutes (DEAP Institutes) for data-intensive research in science and engineering that can accelerate discovery and innovation in a broad array of NASA Science Mission Directorate research domains. The DEAP Institutes will lead innovation by closely collaborating with NASA mentors to harness diverse data sources and to develop and apply new methodologies, technologies, and infrastructure for data management and analysis research. The DEAP Institutes will support convergence between science and engineering research communities as well as expertise in Data Science foundations, systems, and applications. In addition, the DEAP Institutes will enable breakthroughs in science and engineering through collaborative, co-designed programs implementing NASA open science principles and architecture to formulate innovative data-intensive approaches to address critical national challenges.
Successful MUREP DEAP proposals will be funded as multi-year cooperative agreements not to exceed three (3) years. Please see the full Appendix N: MUREP DEAP for more details.
For general inquiries, please contact: MUREPDEAP@nasaprs.com.
A pre-proposal teleconference for the MUREP DEAP opportunity will be held on Wednesday, September 14, 2022 at 4:00 PM Eastern Time. During this session, the MUREP DEAP team will give an in-depth overview of the opportunity and highlight information contained in the EONS 2022 document regarding proposal preparation and requirements. Please visit the MUREP DEAP landing page in NSPIRES for information on how to join the call. Any changes to this session will be posted here as well. Proposers are strongly advised to check for updates prior to the call.
For more information regarding this opportunity, please visit the ENGAGEMENT OPPORTUNITIES IN NASA STEM (EONS-2022) page on the NASA Solicitation and Proposal Integrated Review and Evaluation System (NSPIRES) website and click on "List of Open Program Elements."
Take 10 With a Triton: Meet Angela Song, a Lover of Dogs, Data Analytics and Disneyland – University of California San Diego
Name: Angela Song
Position: Senior Director of the Office of Operational Strategic Initiatives (OSI); Leader of the OSI Tritonlytics Team
Years at UC San Diego: 20 years in January 2023
What she does at UC San Diego: As the senior director of OSI, Song helps spearhead strategic initiatives focused on continuously improving UC San Diego's operational efficiencies and strengthening the university's commitment to a service- and people-oriented culture. She's been with OSI since the beginning. When Chancellor Pradeep K. Khosla created the office in 2013, Song was one of three individuals selected from the Business and Financial Services (BFS) department to work with one of UC San Diego's external partners, McKinsey Consulting, on creating the university's inaugural strategic plan.
Before OSI, Song served as the director of Administration and Metrics in Business and Financial Services, a position she held from the time she joined UC San Diego in 2003. She received her doctorate in Industrial and Organizational Psychology from UC Berkeley, and her background in social science, data analysis, surveys and strategy management led to her role managing and overseeing internal performance metrics and campuswide surveys. Now, the surveys Song helped create (including the Staff@Work Survey, which opened Aug. 16, the Student Satisfaction Survey and the Academics@UCSD survey) are key metrics of the Chancellor's strategic plans.
Song also leads the Tritonlytics team, an OSI working group that aims to transform raw data into valuable insights and accelerate action. Her expertise is a key reason that Tritonlytics' impact extends far beyond UC San Diego; the group assists other institutions nationwide in designing, administering and analyzing large-scale quantitative and qualitative surveys. Coast to coast, over 15 universities across four university systems participate in Tritonlytics' higher education benchmarking program, which analyzes customer service, staff/faculty engagement and campus climate.
What she loves about UC San Diego: Song was inspired by how quickly and collectively the campus jumped into action in response to the COVID-19 pandemic.
"Return to Learn is everything I love about UC San Diego; we collaborated, partnered, shared expertise and put aside our egos and titles to step up (and step back, when necessary) so that our campus could do more than just survive," Song said. "We were able to thrive during a global pandemic and let research and learning continue safely."
Even in the virtual era, Song felt the care and connections between staff, students and faculty at the university were strong. Many other campuses grew paralyzed; Song is proud of how our campus took the pandemic in stride.
"UC San Diego shone as a beacon of hope that helped others to follow our lead," said Song. "Not only do we do things right, we do the right things too."
Best advice received: "Be grateful for everything," she shared with a smile. "Sometimes our greatest blessings are in disguise."
Something unique in your workspace: Song is known around the OSI office as an avid dog lover. Her ten-year-old retired greyhound, Bali, and her five-year-old whippet, Hobbes, are never more than a few feet away from her desk.
"They're always watching me and waiting for me to take a break so I can take them on a W-A-L-K-I-E," she shared with a laugh, spelling the word so her nearby pups wouldn't hear.
Favorite spot on campus: It's hard not to be a sunset lover, and Song is no exception. Near Scripps Pier, a few benches dot a patch of grass that looks out over the ocean. When she can, Song loves to catch a sunset there.
Something most people don't know about her: "Where we spend our free time is an indicator of what we're passionate about," she said. Her passions have led her to work with several non-profit organizations. As an elected board member of the Network for Change and Continuous Innovation, she helps higher education institutions across the country be transformative and agile lifelong models of learning. Song also serves as a board member of the Greater San Diego Whippet Association and volunteers with the Greyhound Adoption Center.
If she had one day to do anything she wanted: "I would go to Disneyland!" she said, grinning. "I haven't been to visit since the new Star Wars Land opened; I can't wait to go!"
11 Fast-Growing Tech Occupations and How to Prepare for Them – Dice Insights
Tech unemployment remains low, and employers everywhere are on the hunt for technologists with all kinds of specializations. With that in mind, which technology occupations have seen the most growth over the past year?
Dice's H1 Tech Job Report lists the tech occupations that have dominated 2022 (so far) in terms of job-posting volume. As the following list makes clear, no matter the organization or industry, there's always a driving need for technologists who can build and maintain the websites, databases, and tech stacks that keep everything running:
How much do some of these occupations pay? How can you break into them, and what kind of training and certifications do you need? Let's dig in!
First things first: software engineers and developers will remain in demand for quite some time to come. According to Lightcast (formerly Emsi Burning Glass), which collects and analyzes millions of job postings from across the country, the median software developer salary stands at $98,728 per year, rising even more with the right mix of experience and skills. (For comparison's sake, the latest Dice Tech Salary Report placed the average technologist salary at $104,566, up 6.9 percent between 2020 and 2021.)
Here are some of the specialized skills that pop up most frequently in software engineer and developer job postings; by mastering these skills, you have a higher likelihood of standing out during the job interview process:
Depending on your specialization, obtaining certain kinds of certifications might be a good idea. For example, if you're a cloud-focused developer who often works with Amazon Web Services (AWS) and Microsoft Azure, earning certifications in those platforms can help assure recruiters and hiring managers that you have the necessary skills to succeed in a particular role.
An effective software developer/engineer resume will also break down previous projects and how you used your skills to help your former employers succeed in their strategic goals. Remember, all resumes should be results-oriented: what did you deliver for those past organizations?
Business analysts have a difficult task: digesting and analyzing massive amounts of company and industry data for crucial insights. If you're interested in a business analyst career, it's important to master analytics tools and skills, and then effectively convey your mastery of those skills via your resume and other application materials.
Business analyst interviews often feature questions about your past projects, your potential for cultural fit, and whether you can solve the potential employer's analytics challenges (which may involve some technical questions). If you succeed, the potential rewards are vast: according to Lightcast, the median business analyst salary sits at $113,000 per year, and that's before you consider perks and benefits such as stock options, which can radically increase total compensation.
Data analysts often tackle very tactical analytics problems for organizations. As a result, you must tailor a data analyst resume to show you've had a significant impact on your organization's key projects. As part of that, you must master analytics and database tools and skills, and perfect soft skills such as empathy and communication, since you'll be communicating results to stakeholders.
Data analysts can earn a median salary of $73,067, according to Lightcast, which doesn't seem like a lot, but as with so many other tech roles, that number rises with skills and experience. A senior data analyst can expect to earn $93,000 per year, according to Glassdoor. Skills that can help a data analyst land a job and boost their salary include:
During a data analyst job interview, you'll most likely need to demonstrate you can use these skills successfully to complete projects and deliver results on time.
Data scientists aren't the same thing as data analysts; for starters, data scientists are usually tasked with tackling more strategic data questions. A couple of researchers also named data science "the sexiest job of the 21st century," and who are we to argue with that?
There are multiple signs that data science will continue as an in-demand occupation for quite some time to come. According to a report by DevSkiller, data science was the fastest-growing tech skill throughout 2021, surpassing Python, PHP, and cybersecurity. HackerEarth's 2021 Developer Survey found that data science was the domain most coveted by both students and professional developers, with 24.4 percent of respondents showing interest, well ahead of blockchain (in second) and cybersecurity (third). Combine that with data from Dice's latest Job Report, and it's clear this is an occupation with a very long runway.
The perfect data scientist resume should include a wide range of data science skills and qualities, as well as (you guessed it) soft skills. In addition to crunching tons of data for insights, data scientists must communicate their results in easy-to-understand ways to multiple stakeholders, including executives who might not grasp the intricacies of the craft. If you succeed in mastering data scientist skills, the potential salary is massive, especially in some of the traditional tech hubs such as Silicon Valley.
New land valuation tool uses big data to inform urban planning and investment – UNSW Newsroom
A new commercial partnership between UNSW, PEXA and Frontier SI is launching a data-driven toolkit that rapidly calculates property valuations to inform urban development. The toolkit, hosted by the new venture Slate Analytics, allows users to visualise diverse infrastructure and planning scenarios and their effect on surrounding property values. It will facilitate data-driven solutions to support better city planning.
The digital toolkit, called the Rapid Analytics Interactive Scenario Explorer (RAISE), uses big data, artificial intelligence and advanced analytics to calculate property valuations across Australia's residential property market. Its interactive, what-if scenario capability represents an industry-first for planning support systems.
"RAISE provides a cyber-secure, scalable and commercially viable solution that reduces the risks and costs associated with land value analysis," says Pettit.
"This new commercial partnership brings together PEXA's expertise in data and digital property settlements and Frontier SI's specialisation in spatial mapping and geodesy with UNSW's leading research on data science and the future of our cities to deliver a world-class platform," Pettit said.
Read more: Why cycling lanes should be on the fast track for cities
Scott Butterworth, Chief Data and Analytics Officer at PEXA, says: "With its evidence-driven scenario modelling, RAISE will revolutionise the way we approach urban planning and development. It could also prove highly beneficial to Australia's lending community, potentially expediting the mortgage approvals process on behalf of Australian homebuyers. PEXA is delighted to be leading this venture, delivering on real-world needs."
The digital toolkit is the result of a longstanding research partnership between Frontier SI and UNSW CFRC.
Graeme Kernich, CEO of FrontierSI, says: "Through Slate Analytics, FrontierSI and UNSW have realised their vision to bring cutting-edge AI into the property valuation products and services space. The investment by PEXA will be critical to furthering this vision and expanding the impact that this capability will have for Australia and beyond."
The venture builds on previous research with UNSW's CFRC, Frontier SI, Commonwealth Bank of Australia, Liverpool City Council, the NSW Office of the Valuer General and property data management company Omnilink, which was funded by an Australian Government Cooperative Research Centres Projects (CRC-P) grant.
The toolkit cleans, links and embeds diverse geospatial and property data from trusted government databases and industry partners, including such sources as the Australian Bureau of Statistics (ABS), Geoscape, the Valuer General's data, and transport, education and crime statistics agencies.
It includes a property market dashboard, developed in response to COVID-19, that provides a daily snapshot of how the property market is performing compared with before the pandemic. It also has an interactive map of Australia identifying COVID case numbers based on data from state departments of health.
The dashboard provides graphs and charts of key property market metrics, including median property prices, auction clearance rates, the house value index and the performance of the ASX 200 real estate sector, to assist government, industry and communities in understanding the impacts COVID-19 is having on the property market.
The toolkit was used in recent research evaluating the proposed high-speed rail network on the east coast of Australia. It revealed that land value growth around the high-speed stations, estimated to increase by up to $140 billion, could be used to fund the network.
"A significant portion of this could be dedicated to funding its construction and supporting the creation of liveable and vibrant cities and regions," Prof. Pettit says.
Read more: UNSW launches Cities Institute to help future-proof our cities
"One of its aims is to improve users' understanding of land economics, in particular the influence that infrastructure provision and other key variables play in land values," Prof. Pettit says.
The tool considers structural attributes, such as the building's quality, age and size; neighbourhood attributes, such as crime levels and socio-economic profile; and accessibility, such as the nearest transport connections, jobs centres and services.
It is accessed through a web browser, allowing multiple users in diverse locations to compare and analyse different scenarios. Users can alter key assumptions within the tool, such as access to transport, schools, hospitals and zoning plans, and map their effect on property values.
For example, users can drag and drop new train stations into urban areas and predict the anticipated increase in value (value uplift) on surrounding properties associated with the accessibility benefits of improved transport infrastructure.
"New and improved transport, including new stations, faster trips, more frequent trips, upgraded services or higher-volume services, can connect residents to jobs and services and also connect businesses to labour, supplier and customer markets," Prof. Pettit says. "As such, there is an expectation that residents and businesses will pay a premium for property serviced by this infrastructure."
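The valuation approach described above is essentially a hedonic model: price is modelled as a function of structural, neighbourhood and accessibility attributes, and "value uplift" is the predicted change when an accessibility input improves. Below is a minimal sketch of that idea; the attribute names, coefficients and base value are purely illustrative assumptions, not RAISE's actual model.

```python
# Toy hedonic land-value model: base value plus weighted attribute
# contributions. All coefficients are illustrative, not calibrated.
COEFFS = {
    "floor_area_sqm": 4_500,           # structural: size adds value
    "building_age_years": -1_200,      # structural: depreciation
    "crime_rate_index": -8_000,        # neighbourhood: crime discounts
    "km_to_nearest_station": -15_000,  # accessibility: distance discounts
}
BASE_VALUE = 650_000  # hypothetical baseline for the study area

def predict_value(attrs: dict) -> float:
    """Linear hedonic prediction from property attributes."""
    return BASE_VALUE + sum(COEFFS[k] * v for k, v in attrs.items())

def value_uplift(attrs: dict, new_station_km: float) -> float:
    """Uplift from a hypothetical new station that shortens the trip
    to transit, holding every other attribute fixed."""
    before = predict_value(attrs)
    after = predict_value({**attrs, "km_to_nearest_station": new_station_km})
    return after - before

home = {
    "floor_area_sqm": 120,
    "building_age_years": 30,
    "crime_rate_index": 2.0,
    "km_to_nearest_station": 4.0,
}
print(predict_value(home))       # predicted value with current access
print(value_uplift(home, 1.0))   # uplift if a station opens 1 km away
```

Swapping the distance-to-station input, as in the drag-and-drop scenario described above, yields the uplift directly. RAISE's production models are far richer, but the mechanics follow this shape.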
"Governments are interested in measuring value uplift to determine the viability of using value capture (taxing this uplift) to fund current or future transport infrastructure projects through policies such as developer charges, stamp duty, land tax and betterment tax," he says.
Value capture is, in essence, a tax on the increase in land values associated with new or upgraded infrastructure, and these taxes can be used to offset infrastructure costs and fund supporting initiatives, such as new parks or increased affordable housing stocks.
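The arithmetic behind value capture is simple to illustrate. In the sketch below, only the $140 billion uplift figure comes from the article's high-speed rail study; the capture rate and project cost are hypothetical policy assumptions.

```python
# Illustrative value-capture arithmetic: a levy claws back a share of
# the land-value uplift created by new infrastructure, offsetting cost.
uplift_total = 140e9   # estimated land-value uplift near stations (article)
capture_rate = 0.30    # share of uplift taxed back (hypothetical policy)
project_cost = 100e9   # construction cost (hypothetical)

captured = uplift_total * capture_rate
share_of_cost = captured / project_cost
print(f"captured ${captured / 1e9:.0f}bn, covering "
      f"{share_of_cost:.0%} of project cost")
```

The policy question is where to set the capture rate: high enough to meaningfully fund the project, low enough not to deter the development that creates the uplift in the first place.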
"There has been renewed national policy interest in value capture in Australia, in the debate around the National Settlement Strategy as well as the national Smart Cities Plan (2016) and the Greater Metropolitan Sydney Planning Strategy (2018), and internationally," he says.
"The expense involved in engaging specialist valuation consultants means that fewer options are typically explored by planners and decision-makers, and effective value capture mechanisms are seldom employed in Australian infrastructure planning," he says.
UNSW will retain a stake in the new venture, supporting it through ongoing research and development.
"The tool makes an important contribution to building more equitable and productive future cities, helping planners, councils and governments maximise investments to get the best outcomes, including better public transport, additional green space and more affordable housing stock for the whole community," Prof. Pettit says.
Read more: Family-friendly apartments are key to a compact city
RAISE was co-designed through a series of workshops with research partners, state government-based valuers and local government urban planners to ensure its functionality and user experience aligned with end-user expectations.
The workshops evaluated rapid prototypes to optimise the design according to its data inputs, modelling and scenario formulation and visualisation, interface and workflow. Feedback shaped further development, extending the tools base functionality, model and data integration, and collaborative features through a series of iterations.
"For example, feedback from stakeholders in the co-design workshops prompted the introduction of school-quality indicators into the modelling," Prof. Pettit says. Properties in catchments of schools performing well, as measured by NAPLAN results, were found to be valued higher.
"The need for digital technologies, such as data-driven scenario tools, will continue to grow with rapid urbanisation and global population growth. However, the inherent complexity of big data, as well as the diverse players and conflicting agendas, mean the tools work best in tandem with expert knowledge," he says.
"These tools and technologies work hand in hand with planners and those key decision-makers who shape the future of the city. We're all striving towards more liveable cities, more sustainable cities, more resilient, productive [and] inclusive cities. The tools are used to support those sorts of big-picture challenges."
Read the original here:
New land valuation tool uses big data to inform urban planning and investment - UNSW Newsroom
Boosting mortgage processes with data-driven ways of working – Consultancy.eu
The use of data science can offer mortgage providers major opportunities to enhance their services and operations. Chris van Winden and Mando Rotman from IG&H outline three use cases of adopting more data-driven ways of working.
The ambassadors of mortgage providers are financial advisors, who are in continuous contact with customers. By using data to analyse how advisors operate, significant improvements can be made in processes and outcomes.
In one concrete example, IG&H helped a mortgage client in the Netherlands analyse the end-to-end value chain of advisors, from the types of customers they serve to the advice they give and how they follow up. Leveraging data, we managed to help the client improve results and embed a cycle of continuous improvement.
Using machine learning, for instance, can significantly reduce the workload for employees and allow standard loan applications to be evaluated much faster. This in turn frees up human capacity, which means that mortgage providers can allocate more time towards serving their customers.
In one client example, IG&H worked with a client to automate the mortgage application process with the help of intelligent, self-learning algorithms. Prior to the roll out of machine learning, all cases were assessed manually. With the help of the models, only 25% of cases are currently reviewed manually by specialists.
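The pattern described above can be sketched as a score-and-threshold triage pipeline: a model scores each application, clear-cut cases are processed automatically, and only uncertain ones are routed to a specialist. The scoring rules, field names and threshold below are invented for illustration and are not IG&H's actual model.

```python
# Sketch of automated mortgage triage: score each application, route
# only high-risk / uncertain cases to manual review by a specialist.
def risk_score(application: dict) -> float:
    """Toy rule-based score in [0, 1]; higher means more scrutiny needed.
    A real system would use a trained, self-learning model instead."""
    score = 0.0
    if application["loan_to_value"] > 0.9:
        score += 0.5
    if application["debt_to_income"] > 0.4:
        score += 0.4
    if not application["stable_income"]:
        score += 0.3
    return min(score, 1.0)

def triage(applications: list, threshold: float = 0.5):
    """Split applications into auto-processed and manual-review queues."""
    auto, manual = [], []
    for app in applications:
        (manual if risk_score(app) >= threshold else auto).append(app)
    return auto, manual

apps = [
    {"id": 1, "loan_to_value": 0.80, "debt_to_income": 0.3, "stable_income": True},
    {"id": 2, "loan_to_value": 0.95, "debt_to_income": 0.5, "stable_income": True},
    {"id": 3, "loan_to_value": 0.70, "debt_to_income": 0.2, "stable_income": False},
    {"id": 4, "loan_to_value": 0.92, "debt_to_income": 0.3, "stable_income": False},
]
auto, manual = triage(apps)
print(f"{len(manual)}/{len(apps)} cases routed to specialists")
```

Tuning the threshold trades automation rate against specialist workload; reaching a figure like the 25% manual-review rate cited above is a business decision informed by the model's measured error rates.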
Proactive management of existing customers requires using data to anticipate needs and providing tailored messaging. Intelligent models such as artificial intelligence can help predict and advise on how to approach customers in an optimal way.
Data can also help with assessing customer segments, and predicting their needs and behaviours.
Meanwhile, a technological solution such as a chatbot can improve customer service. From a client's perspective, a chatbot is available 24/7, enabling customers to ask questions when it is most convenient for them. While still in their infancy, well-designed AI chatbots can provide a good level of service to potential or existing customers. As a bonus, the chatbot can provide valuable insights through its interactions with customers.
More here:
Boosting mortgage processes with data-driven ways of working - Consultancy.eu
Correlation One Partners with OneTen to Drive Economic Prosperity for Black Talent in the Data Field – PR Newswire
Correlation One forms strategic partnership with nation's leading coalition committed to hiring and advancing Black Americans without four-year degrees into family-sustaining jobs
NEW YORK, Aug. 25, 2022 /PRNewswire/ -- Correlation One today announced that it has joined forces with OneTen, a coalition designed to close the opportunity gap for Black talent in the United States by working with America's leading executives, companies, and talent developers to hire and advance one million Black Americans without four-year degrees into family-sustaining roles over the next 10 years. This strategic partnership will allow Correlation One to share and shape best practices for sourcing, developing, and hiring Black talent.
The strategic partnership will help advance Black Americans without four-year degrees into lucrative data careers.
As an endorsed OneTen talent developer, Correlation One is committed to increasing the hiring of Black talent without four-year college degrees into family-sustaining jobs by improving their hiring, retention, upskilling, and advancement practices to support a more diverse workforce and advance economic prosperity for all. As a OneTen coalition member and award-winning data skills training provider, Correlation One will train emerging data analysts and engineers and connect them with employers through a distinctive "last mile to jobs" approach.
"As part of OneTen's growing portfolio of career development providers bridging the gap between rising Black talent and top employers, we're proud to lend a hand in the advancement of Black Americans' financial and professional goals," said Sham Mustafa, Correlation One Co-Founder and Co-CEO. "At the same time, we know that by diversifying their data talent, our forward-thinking Employer Partners can leverage a wider range of capabilities and viewpoints to advance business goals."
Offered 100% free to learners from groups historically underrepresented in the data field, Correlation One training programs have received more than 150,000 applications and trained over 7,000 students. Over 60% of the trainees are Black, a strikingly significant number given that Black professionals comprise only 7.3% of U.S. data analysts. The U.S. Bureau of Labor Statistics anticipates strong job growth within the data field at a rate of 27.9% through 2026, with employers willing to pay well for emerging talent. However, significant and pervasive structural barriers remain in place for rising data talent to acquire the technical skills necessary to secure professional positions in the lucrative, growing data economy.
While the job market continues to grow amid rising inflation and a looming recession, the racial wealth gap in America remains wide. This is largely due to the lack of access to quality, well-paying jobs that do not require college degrees: 79% of jobs paying more than $50,000 require a four-year college degree, which automatically excludes the 76% of Black Americans over age 25 with relevant experience who don't have baccalaureate degrees. With Black professionals representing less than 8% of the technology workforce, harnessing multi-stakeholder partnerships is vital to spearheading diversity and fostering pathways to success.
"Connecting Black talent through the kind of data skills training and upskilling Correlation One offers is critical to addressing the growing needs in today's data economy," said Maurice Jones, OneTen CEO. "We're thrilled to partner with Correlation One - to unlock workforce opportunities for Black talent in the data-driven jobs of the future and implement the skills that will enable Black talent to achieve generational wealth."
About Correlation One
Correlation One is a technology company whose mission is to create equal access to the data-driven jobs of tomorrow and believes that data literacy is the most important skill for the future of work. The company makes data fluency a competitive edge through global data science competitions, rigorous data skills assessments, and enterprise-focused data science education. Correlation One's solutions are used by some of the most elite employers all around the world in finance, technology, healthcare, insurance, consulting, and governmental agencies. Since launching in 2015, Correlation One has built an expert community of 250,000+ data scientists and 600+ partnerships with leading universities and data science organizations in the US, UK, Canada, China, and Latin America.
About OneTen
OneTen is a coalition of leading chief executives and their companies who are coming together to upskill, hire and promote one million Black individuals who do not yet have a four-year degree into family-sustaining jobs with opportunities for advancement over the next 10 years. OneTen connects employers with talent developers, including leading nonprofits and other skill-credentialing organizations, who support development of diverse talent. By creating more equitable and inclusive workforces, we believe we can reach our full potential as a nation of united citizens. OneTen recognizes the unique potential in everyone: every individual, every business, every community, to change the arc of America's story with Black talent. Join us at OneTen.org, where one can be the difference.
Media Contact: Pamela Price, [emailprotected]
SOURCE Correlation One
Read more here: