Content provided by EM360Tech. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by EM360Tech or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ko.player.fm/legal.
Turn data overwhelm into data-driven success. Led by an ensemble cast of expert interviewers offering in-depth analysis and practical advice to make the most of your organization's data.
In this episode of the Don’t Panic, It’s Just Data podcast , Kevin Petrie, VP of Research at BARC, is joined by Nidhi Ram, Vice President of Global Services Strategy and Operational Excellence at Precisely. The duo explore the idea of focusing on data modernisation and improving accessibility rather than constantly implementing new technologies. Both Petrie and Ram highlight the importance of traditional mainframes, especially in modern data strategies. They delve into how companies can combine cloud tools with data in order to handle diverse data systems. This removes the need for replacing the mainframe, keeping the data accessible for all users. Going into the critical role of data quality and governance — especially in the age of AI — Ram emphasises that “garbage in, garbage out” has never been more relevant, as AI outputs are only as good as the data feeding the model. What's needed is a more comprehensive approach to data integration, quality, governance, and enrichment that helps ensure data is always ready for confident business decisions. Listen to this latest episode to learn how Precisely’s Data Integrity Suite provides a comprehensive approach to data modernisation. Takeaways Data modernisation is about accessibility, not technology. The mainframe continues to play a crucial role in data strategies. High-quality data is essential for successful AI initiatives. Data governance is critical to comply with regulations and ensure data quality. Cloud solutions offer flexibility, but on-premise systems provide control. Companies need to adapt to a heterogeneous data environment. Integrating people and processes is key to successful data programs. Future data roles will require broad functional knowledge rather than deep technical skills. Chapters 00:00 Introduction to Data Modernization 02:56 Understanding Data Modernization 06:01 Challenges in Data Modernization 08:53 The Role of AI in Data Strategies 11:56 Data Quality and Governance 15:08 Cloud vs On-Premise Data Solutions 18:14 Adapting to Diverse Data Environments 20:47 Advice for Modernizing Data Strategies 23:58 The Importance of People and Process 27:07 Future Skills for Data Teams About Precisely Precisely is a leading global data integrity provider, ensuring organisations have accurate, consistent, and contextual data. In today's data-driven world, where businesses rely on information for critical decisions, data integrity is crucial. Precisely offers a comprehensive portfolio of solutions designed to transform raw data into a reliable asset. Trusted by over 12,000 organisations in more than 100 countries, Precisely plays a critical role in helping businesses effectively leverage their data. By providing reliable software and strategic services, Precisely empowers organisations to confidently embark on their AI, automation, and analytics initiatives. High-quality, trustworthy data is the bedrock for successful AI models, efficient automation processes, and insightful analytics. Without it, these initiatives risk delivering inaccurate results and misleading insights. Precisely's commitment to data integrity allows businesses to make confident decisions and achieve strategic objectives.…
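To make the "garbage in, garbage out" point concrete, below is a minimal Python sketch of the kind of automated completeness, validity, and freshness checks that decide whether a dataset is ready to feed an AI model. It is an illustration only; the column names and thresholds are hypothetical, and it does not represent Precisely's Data Integrity Suite.

```python
# Minimal, illustrative data-quality gate for AI readiness.
# Column names and thresholds are hypothetical.
from datetime import datetime, timedelta, timezone
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Return basic completeness, validity, and freshness metrics for a customer table."""
    now = datetime.now(timezone.utc)
    return {
        # Completeness: share of rows with no missing critical fields
        "completeness": float(df[["customer_id", "email"]].notna().all(axis=1).mean()),
        # Validity: share of email values that look structurally valid
        "validity": float(df["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False).mean()),
        # Freshness: share of rows updated within the last 30 days
        "freshness": float((now - pd.to_datetime(df["updated_at"], utc=True) < timedelta(days=30)).mean()),
    }

if __name__ == "__main__":
    records = pd.DataFrame({
        "customer_id": [1, 2, 3],
        "email": ["a@example.com", None, "not-an-email"],
        "updated_at": ["2025-04-01T00:00:00Z", "2023-01-01T00:00:00Z", "2025-03-15T00:00:00Z"],
    })
    report = quality_report(records)
    # Gate the AI pipeline: only proceed when every metric clears a threshold.
    ready = all(score >= 0.8 for score in report.values())
    print(report, "AI-ready:", ready)
```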
In this episode of the Don't Panic, It's Just Data podcast, hosted by EM360Tech's podcast producer, Shubhangi Dua speaks to Donnie Owsley from Snohomish County, and Jeff Burton and Tom Lavery from the University of British Columbia. All of the speakers will be presenting at the upcoming Peak of Data and AI event, organised by Safe Software , the creators of FME. Scheduled to take place in Seattle from May 5th to 8th, 2025, The Peak is an exciting gathering for data and AI innovators. This conversation offers a preview of some of the practical applications and insights that will be shared at the event. The podcast also talks about the development of creative solutions for enhancing accessibility in urban environments. The UBC speakers particularly refer to their creation of an accessible university campus navigation system, a project that showcases the power of integrating FME with platforms like ArcGIS. This discussion spotlights the challenges and ingenuity involved in building inclusive wayfinding solutions that cater to the diverse needs of a community. The conversation sheds light on some tangible ways in which FME is being used across different sectors to tackle specific challenges and boost creative innovations. It provides valuable context for the types of practical knowledge and problem-solving approaches that will be central to The Peak of Data and AI event. For further information on what we’ve talked about and to register for The Peak of Data and AI event in Seattle, please head over to peakofdataintegration.com . Key Highlights Discover how to use tools like FME for preemptive IT issue resolution. Learn the approach to creating inclusive navigation systems with FME and ArcGIS. Get practical insights into current industry applications. Preview actionable data and AI solutions. Explore the versatile application of FME in your organisation. About Safe Software Founded in 1993, Safe is headquartered in Surrey, BC with over 200 team members and counting. We’re always looking for talented individuals with diverse backgrounds who are determined to learn and grow. Over 20,000 organisations around the world use FME in industries like AEC, government, utilities, and transportation to maximise the value of their data.…
Takeaways #FinanceTeams are often resistant to change, clinging to outdated tools. Real-time data integration can help break down data silos and enable better decision-making. The shift from FP&A to XP&A emphasises collaboration across departments. #DataGovernance is crucial when handling financial data. Real-time data enables more accurate forecasting and budgeting. Organisations should define their data strategy before implementation. Proactive adaptation to data technologies is essential for future success. Summary In this episode of "Don't Panic, It's Just Data," host Christina Stathopoulos explores the world of real-time analytics and its impact on financial decision-making. She is joined by Thomas Gore, insightsoftware's Director of Product Management for extended planning and analysis (XP&A), and Cody Riemenschneider, Director of Solutions Engineering, and they discuss the challenges and opportunities of integrating real-time data into finance. We explore the issue of data silos and how real-time AI integration can break them down. Thomas explains: "Each department has its own data report, its own planning tools, which are mostly Excel files... They don't collaborate." Listen to our latest podcast now for an insightful conversation on how real-time data is reshaping the future of #FinancialPlanning and analysis. For the latest tech insights visit: EM360Tech.com…
"So you want trusted data, but you want it now? Building this trust really starts with transparency and collaboration. It's not just technology. It's about creating a single governed view of data that is consistent no matter who accesses it, " says Errol Rodericks, Director of Product Marketing at Denodo . In this episode of the 'Don't Panic, It's Just Data' podcast, Shawn Rogers, CEO at BARC US, speaks with Errol Rodericks from Denodo . They explore the crucial link between trusted data and successful AI initiatives. They discuss key factors such as data orchestration, governance , and cost management within complex cloud environments. We've all heard the horror stories – AI projects that fail spectacularly, delivering biased or inaccurate results. But what's the root cause of these failures? More often than not, it's a lack of focus on the data itself. Rodericks emphasises that "AI is only as good as the data it's trained on." This episode explores how organisations can avoid the "garbage in, garbage out" scenario by prioritising data quality, lineage, and responsible AI practices. Learn how to avoid AI failures and discover strategies for building an AI-ready data foundation that ensures trusted, reliable outcomes. Key topics include overcoming data bias, ETL processes, and improving data sharing practices. Takeaways Bad data leads to bad AI outputs. Trust in data is essential for effective AI. Organisations must prioritise data quality and orchestration. Transparency and collaboration are key to building trust in data. Compliance is a responsibility for the entire organisation, not just IT. Agility in accessing data is crucial for AI success. Chapters 00:00 The Importance of Data Quality in AI 02:57 Building Trust in Data Ecosystems 06:11 Navigating Complex Data Landscapes 09:11 Top-Down Pressure for AI Strategy 11:49 Responsible AI and Data Governance 15:08 Challenges in Personalisation and Compliance 17:47 The Role of Speed in Data Utilisation 20:47 Advice for CFOs on AI Investments About Denodo Denodo is a leader in data management. The award-winning Denodo Platform is the leading logical data management platform for transforming data into trustworthy insights and outcomes for all data-related initiatives across the enterprise, including AI and self-service. Denodo's customers in all industries all over the world have delivered trusted AI-ready and business-ready data in a third of the time and with 10x better performance than with lakehouses and other mainstream data platforms alone.…
Takeaways #CMDM is essential for creating a unified and accurate view of the customer. Trust comes from how responsibly data is handled. Ethical data use is a competitive advantage and builds trust. Executive sponsorship and cross-team buy-in are crucial for MDM projects. A strong customer #DataStrategy is a competitive edge, not a side project. Summary Customer Master Data Management (CMDM), as Matthew Cawsey, Director of Product Marketing at Stibo Systems, describes it, is "about having a master record of a customer, as opposed to many organisations where customer data gets created and authored in potentially dozens of different CRMs, ERPs, and finance systems." In this episode of our 'Don't Panic, It's Just Data' podcast series, Christina Stathopoulos, Founder at Dare to Data, speaks with Matthew Cawsey and Arnjah Dillard from Stibo Systems. Matthew and Arnjah explain the importance of CMDM and CXDC for maintaining positive customer engagement. Learn how to move beyond basic personalisation, use #data ethically, and leverage #AI for a better customer experience. Let's transform your messy data into a competitive advantage. For the latest tech insights visit: EM360Tech.com…
In the latest episode of the Don’t Panic It’s Just Data podcast, we connected with speakers who provided a preview of their presentations at the upcoming Peak of Data and AI event in Seattle organised by Safe Software from May 5-8, 2025. This premier gathering, hosted by Safe Software , the creators of FME , will be a hub for data and AI innovators, and this podcast episode offers an exclusive look into what attendees can expect. Our conversation featured Margaret Smith and Reshma Joy from the West Virginia Department of Transportation. They shared their crucial work in ensuring data integrity through rigorous validations of their Linear Reference System data. This foundational work underpins much of their operational efficiency and decision-making. They further revealed how they’ve achieved seamless integration between Survey123 and their R&H data, showcasing a strong example of how disparate systems can be harmonized for greater insight. This presentation will provide attendees with actionable strategies for enhancing data quality and interoperability. We also spoke to Bruno Blanco, a GIS Engineer from Shelby County 9-1-1. Bruno walked us through how FME supports critical aspects of their 911 addressing workflow—particularly data aggregation , QA/QC, and attribution—within a larger automation framework. This work highlighted the power of automation in critical public safety infrastructure. By streamlining their addressing processes, Shelby County 9-1-1 is improving response times and ensuring more accurate location data, ultimately saving lives. Bruno’s presentation will offer valuable insights into how organisations can leverage FME to automate complex workflows and enhance operational efficiency. This episode serves as a compelling preview for the main event at The Peak of Data and AI. If you’d like to learn more about Bruno and Shelby County 9-1-1’s story, check out their success story with Safe. For further information on what we’ve talked about and to register for The Peak of Data and AI event in Seattle, please head over to peakofdataintegration.com. Takeaways Data validation is essential for accurate operations. FME enables seamless integration of disparate systems. Automation of critical processes improves public safety. Networking and community learning are key benefits of The Peak. Breakout sessions provide valuable hands-on FME knowledge. AI is increasingly influencing data integration workflows.…
"There's been a lot of wrangling of data, a lot of wrangling of humans as well, which is a big theme for today," says Warwick Leitch, Product Management Director at insightsoftware. In this episode of the ' Don't Panic, It's Just Data ' podcast, Debbie Reynolds, CEO and Chief Data Privacy Officer at Debbie Reynolds Consulting LLC, speaks with Leitch from insightsoftware. They discuss the vital role of financial strategy and collaborative planning, particularly as it pertains to the decisions made by IT executives. The question they address is: In a world awash with data, how do we transform it into actionable insights? Warwick shares his wealth of experience, offering practical advice and illuminating the path to successful budgeting and forecasting. One such challenge addressed in the podcast is how organisations are securing executive buy-in. "And 51 percent of people find it difficult to engage senior executives to buy into the process, which is a roadblock. And 57 percent of organisations struggle cross-functionally," Warwick reveals. It's not just about the numbers. Warwick also emphasises the human element, reminding us that "Without people, we don't have anything." In an era where AI looms large, it's crucial to remember that technology serves to enhance, not replace, human collaboration. Tune in to the podcast and learn how to navigate the complexities of financial strategy and collaborative planning. Takeaways Executive buy-in is crucial for successful budgeting. A clear vision helps guide the budgeting process. Thoughtful execution is key to effective planning. Fostering a culture of collaboration enhances participation. Data accuracy is vital in today's fast-paced environment. Avoid overcomplicating the budgeting process. Gamification can improve engagement in budgeting. AI can significantly streamline forecasting and reporting. Regularly updating forecasts leads to better accuracy. Understanding business measures is essential for effective planning. Chapters 00:00 Introduction to Collaborative Financial Planning 05:01 The Importance of Executive Buy-In 10:09 Thoughtful Execution in Budgeting 15:01 Fostering a Culture of Collaboration 19:48 Defining Business Measures and Data Accuracy 24:52 Common Pitfalls in Collaborative Budgeting 29:52 The Future of Collaborative Planning with AI…
The digital age is fueled by data, and the engines powering that data are data centres. However, this growth comes at a significant energy cost. In the latest episode of the EM360Tech Don't Panic, It's Just Data podcast, Shubhangi Dua speaks with Rolf Bienert, Technical & Managing Director of the OpenADR Alliance, to shed light on the urgent need for sustainable energy practices within the data centre industry. In this episode, we discuss the stark reality of escalating energy consumption, driven by factors like the rise of AI, and the critical importance of moving beyond superficial "green" initiatives to implement genuine, impactful solutions. Covering the historical context of data centre energy usage, the evolution of energy demands, and the challenges of achieving net-zero goals, Rolf provides valuable insights into innovative solutions such as smart grids, microgrids, and virtual power plants. These hold immense potential for managing energy distribution efficiently and sustainably. Beyond technological solutions, the podcast addresses the critical role of regulatory frameworks and industry standards in fostering sustainable practices. The frameworks are necessary to adapt to modern energy consumption patterns, ensuring interoperability and reducing costs. It spotlights the importance of collaboration between IT and utility sectors, as well as open communication with the public, to address concerns about energy consumption and build trust. Takeaways Data centres are increasingly becoming significant consumers of energy. Sustainability in data centres is often perceived as branding rather than genuine effort. AI's demand for processing power is escalating energy needs. Smart grids are essential for managing energy distribution effectively. Microgrids and virtual power plants offer promising solutions for energy sustainability. Enterprises can leverage renewable energy to become energy providers. Regulatory frameworks need to adapt to modern energy consumption patterns. Standards are crucial for ensuring interoperability and reducing costs. Collaboration between IT and utility sectors is vital for sustainable energy management. Open communication is key to addressing public concerns about energy consumption. Chapters 00:00 Introduction to Data Centre Sustainability 03:22 Historical Perspective on Data Centre Energy Consumption 08:32 The Role of Smart Grids in Energy Management 12:40 Understanding Microgrids and Virtual Power Plants 21:30 Enterprise Strategies for Sustainable Data Centres 29:51 Regulatory Challenges and Opportunities 32:34 The Importance of Standards in Data Centre Growth…
As organisations strive to stay competitive in the age of AI, data trust has become a critical factor for success. Without it, even the most advanced AI initiatives are bound to fall short. With the rapid advancement of technology, prioritising trust in data is essential for unlocking AI's full potential and driving meaningful results. Conversely, a lack of data trust can undermine decision-making, operational efficiency, and the success of AI initiatives. In this episode, Christina Stathopoulos, Founder at Dare to Data, speaks to Jay Limburn, Chief Product Officer at Ataccama, to explore these pressing topics. Together, they share actionable insights, real-world examples, and innovative strategies to help businesses harness the power of trusted data. Key Takeaways Data trust is essential for confident decision-making. AI can significantly reduce mundane tasks. Organizations must focus on their data strategy. Customer experience is a key area for AI application. Data teams are crucial for successful AI initiatives. Proactive data management is becoming the norm. The chief data officer's influence is growing. Data quality and security are critical challenges. AI can enhance regulatory reporting processes. Trust in data is vital for successful AI projects. Chapters 00:00 - Introduction to Data Trust and AI Integration 09:36 - The Role of AI in Operational Efficiency 12:47 - Balancing Short-term and Long-term Data Priorities 15:00 - Enhancing Customer Experience through AI 19:08 - Aligning Workforce and Culture for AI Success 21:03 - Innovative Strategies for Data Quality and Security 24:35 - Final Thoughts on Data Trust and AI Success…
The San Antonio River Authority (SARA) has experienced a transformative shift in data management, thanks to the powerful capabilities of FME. By integrating FME, SARA has streamlined data integration, improved efficiency, and enhanced decision-making processes across multiple departments. FME’s ability to automate data transformation, standardise formats, and manage large volumes of spatial data has allowed the authority to optimise workflows, reduce manual errors, and accelerate project timelines. A key highlight of SARA’s success with FME is its use in predictive flood modelling and the standardisation of data workflows. By leveraging FME, SARA can more accurately predict flood risks, improving public safety and response times. This innovation not only enhances internal operations but also helps SARA lead in sustainable water management. With its versatility in handling diverse data sources and streamlining communication between systems, FME is a powerful investment for organisations seeking to improve operational efficiency and long-term strategic decision-making. In this episode, Debbie Reynolds, Founder and Chief Data Privacy Officer at Debbie Reynolds Consulting, speaks to Jordan Merson, Enterprise Applications Supervisor at San Antonio River Authority, about the game-changing impact of FME. Key Takeaways: Data management challenges often stem from a lack of standardisation. FME allows for the integration of various data sources seamlessly. Predictive modelling can enhance emergency response efforts. FME provides tools for real-time data monitoring and alerts. The user-friendly interface of FME accelerates onboarding for new team members. FME can handle both spatial and non-spatial data effectively. Collaboration and knowledge sharing are key to successful data management. Chapters: 00:00 - Introduction to Data Management and FME 02:30 - Jordan's Journey in IT and Data Management 05:43 - Challenges Before FME Implementation 08:36 - Transformative Impact of FME on Data Processes 10:02 - Real-World Applications of FME at San Antonio River Authority 14:06 - Predictive Flood Modeling and Emergency Operations 16:31 - Standardization and Efficiency with FME 17:59 - Final Thoughts and Recommendations on FME…
Summary This discussion explores the complexities and strategies surrounding edge computing and data management, highlighting the importance of security, the challenges of vendor lock-in, the implications of data repatriation, and the necessity of ensuring high-quality data for AI systems. It emphasises the need for organisations to balance edge processing with centralised storage while future-proofing their data strategies against rapid technological changes. Building on their discussion, Jimmy Tam highlights the transformative role of edge computing in modern data management, emphasising the importance of governance, compliance, and interoperability to address the challenges of data sprawl and vendor lock-in. Takeaways Edge computing is transforming how organisations manage data. Security at the edge is paramount to prevent intrusions. Data sprawl poses significant challenges for edge data management. Governance and compliance are essential for effective data management. Vendor lock-in can limit flexibility and adaptability in technology. Data interoperability is crucial for avoiding vendor lock-in. Data repatriation is a growing trend among organisations. AI systems require access to comprehensive data for training. Speed of data relevance is critical for effective AI applications. Flexibility in data strategies is essential for future-proofing organisations. Sound Bites "Data sprawl is a significant problem." "Governance and compliance are crucial." "Data repatriation is absolutely real." "Speed of data relevance is critical." Chapters 00:00 Introduction to Edge Computing and Data Management 02:53 Security Strategies for Edge Data 06:06 Vendor Lock-In and Data Interoperability 09:00 Data Repatriation and Cost Optimisation 11:57 Ensuring Quality Data for AI Systems 14:46 Balancing Edge Processing and Centralised Storage 17:59 Future-Proofing Data Strategies…
In today’s data-driven world, real-time analytics has become a cornerstone for businesses seeking to make smarter, faster decisions. From enhancing user experiences to enabling continuous intelligence, the ability to process data in real-time is transforming industries. Yet, challenges such as legacy systems and the demand for innovative data management approaches persist. This episode explores the evolution of real-time analytics and its crucial role in modern data processing. We delve into how technology is reshaping the way businesses interact with data and the importance of user-centric design in creating powerful data applications. Joining Christina Stathopoulos, Founder of Dare to Data, is Rahul Rastogi, Chief Innovation Officer at SingleStore. Together, they discuss the necessity of real-time data in today’s fast-paced business environment, tackle the challenges organizations face in adapting to this shift, and highlight how data serves as the foundation for AI-driven innovation. Don’t miss this insightful discussion packed with practical strategies and forward-looking ideas! Key Takeaways Real-time analytics has evolved from a luxury to a necessity. Streaming technologies like Kafka and Spark have revolutionized data processing. Legacy systems are often monolithic and ill-suited for real-time analytics. Modern data platforms enable easier data management and integration. Continuous intelligence requires a solid analytics foundation. User experience is critical for the adoption of data applications. Organizations must treat data as a valuable asset. Data governance and quality are essential for effective analytics. The separation of compute from storage enhances scalability. Real-time processing with low latency improves user satisfaction. Chapters 00:00 - Introduction to Real-Time Analytics 06:14 - The Evolution of Technology in Data Processing 10:09 - Challenges of Legacy Systems 14:23 - Innovative Approaches to Data Management 18:06 - Building a Foundation for AI Innovations 21:27 - User Experience in Data Applications…
The convergence of Master Data Management (MDM) and Artificial Intelligence (AI) is transforming how businesses harness data to drive innovation and efficiency. MDM provides the foundation by organising, standardising, and maintaining critical business data, ensuring consistency and accuracy across an organisation. When paired with AI, this clean and structured data becomes a powerful asset, enabling advanced analytics, predictive insights, and intelligent automation. MDM and AI help businesses uncover hidden patterns, streamline operations, and make more informed decisions in real-time. By integrating MDM with AI, organisations can move beyond simply managing data to actively leveraging it for competitive advantage. AI algorithms thrive on high-quality, well-structured data, and MDM ensures just that—minimising errors and redundancies that could compromise results. This synergy empowers companies to personalise customer experiences, optimise supply chains, and respond proactively to market changes. In this episode, Kevin Petrie, VP of Research at BARC US, speaks to Jesper Grode, Director of Product Innovation at Stibo Systems, about the intersection between AI and MDM. Key Takeaways: AI and master data management should be integrated for better outcomes. Master data improves the quality of inputs for AI models. Accurate data is crucial for training machine learning models. Generative AI can enhance product launch processes. Prompt engineering is essential for generating accurate AI responses. AI can optimise MDM processes and reduce operational costs. Fast prototyping is vital for successful AI implementation. Chapters: 00:00 - Introduction to AI and Master Data Management 02:59 - The Synergy Between AI and Master Data 05:49 - Generative AI and Master Data Management 09:12 - Leveraging Master Data for Small Language Models 11:58 - AI's Role in Optimizing Master Data Management 14:53 - Best Practices for Implementing AI in MDM…
As cloud adoption grows, so do the challenges of managing costs effectively. Cloud environments offer scalability and flexibility but often come with hidden fees, unpredictable expenses, and resource sprawl that can quickly inflate budgets. Without the right tools and strategies, businesses may struggle to track spending, identify waste, and maintain budget alignment. Usage-based reporting is pivotal in this process, providing the granular visibility needed to understand real-time consumption patterns and optimise costs. Businesses can align expenses directly with value-driven activities by tracking how, where, and when resources are used. From preventing overspending to fostering accountability, usage-based reporting empowers teams to proactively manage their cloud expenses, turning cloud cost management into a strategic advantage rather than a recurring headache. In this episode, George Firican, Founder of LightsOnData, speaks to Rem Baumann, Resident FinOps Expert at Vantage, about usage-based reporting and its benefits. Key Takeaways: Organisations face challenges in tracking complex cloud costs. Usage-based reporting provides context to cloud spending. Metrics should align with business goals for effective decision-making. Communication between finance and engineering teams is crucial. Identifying cost optimisation opportunities can lead to significant savings. Different industries require customised cost metrics. Cloud providers offer basic tools, but deeper insights are needed. Regular monitoring of metrics ensures financial transparency. Chapters: 00:00 - Introduction to Cloud Cost Management 03:03 - Understanding Cloud Complexity and Cost Tracking 05:53 - The Role of Usage-Based Reporting 09:06 - Metrics for Cost Optimization 12:02 - Industry-Specific Applications of Cost Metrics 14:49 - Aligning Cloud Costs with Business Goals 18:09 - Conclusion and Key Takeaways…
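As a rough illustration of usage-based reporting, the sketch below joins daily cloud cost records with a business activity metric to produce a cost-per-unit view. The field names and figures are made up and do not reflect Vantage's schema.

```python
# Illustrative sketch: align raw cloud spend with a business metric
# to report cost per unit of value delivered. All figures are hypothetical.
from collections import defaultdict

cost_records = [  # daily cost by service, e.g. exported from a billing feed
    {"day": "2025-03-01", "service": "compute", "usd": 420.0},
    {"day": "2025-03-01", "service": "storage", "usd": 80.0},
    {"day": "2025-03-02", "service": "compute", "usd": 510.0},
]
orders_processed = {"2025-03-01": 12_500, "2025-03-02": 14_200}  # business metric

daily_cost: dict = defaultdict(float)
for rec in cost_records:
    daily_cost[rec["day"]] += rec["usd"]

for day, usd in sorted(daily_cost.items()):
    unit_cost = usd / orders_processed[day]
    print(f"{day}: ${usd:,.2f} total, ${unit_cost:.4f} per order")
```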
Data custodianship today involves managing and protecting vast quantities of sensitive information, requiring organisations to ensure security, regulatory compliance, and ethical usage. It’s not just about protecting data from breaches but also about responsible storage, access, and deletion that aligns with strict industry standards and evolving privacy regulations. The ethical dimensions of data custodianship add further complexity as organisations balance the need for data-driven insights with privacy rights and transparent usage. Mismanagement can lead to significant financial, legal, and reputational risks, making effective custodianship essential for maintaining customer trust and regulatory compliance. In this episode, Paulina Rios Maya, Head of Industry Relations, speaks to Debbie Reynolds, Founder and Chief Data Privacy Officer at Debbie Reynolds Consulting, about compliance with global regulations, the role of AI in data management, and the necessity of human oversight in technology. Key Takeaways: Data custodianship emphasises that data belongs to individuals, not companies. Organisations must have a comprehensive plan for data management throughout its lifecycle. Transparency and communication with consumers are essential in data handling. Different types of data require different levels of protection based on risk. Building trust with consumers requires responsible data practices. Organisations need to prioritise basic data protection strategies over compliance with every regulation. Chapters: 00:00 - Introduction to Data Custodianship 03:03 - Understanding Responsibilities in Data Handling 05:59 - Balancing Innovation and Data Protection 08:45 - Building Trust Through Responsible Data Practices 12:07 - Navigating Compliance and Data Governance 14:54 - Leveraging AI for Enhanced Data Custodianship 18:06 - The Role of Humans in Technology and Data Management…
Generative AI and unstructured data are transforming how businesses improve customer experiences and streamline internal processes. As technology evolves, companies find new ways to gain insights, automate tasks, and personalize interactions, unlocking new growth opportunities. The integration of these technologies is reshaping operations, driving efficiency, and enhancing decision-making, helping businesses stay competitive and agile in a rapidly changing landscape. Organizations that embrace these innovations can better adapt to customer needs and market demands, positioning themselves for long-term success. In this episode, Doug Laney speaks to Katrina M. Conn, Senior Practice Director of Data Science at Teradata, and Sri Raghavan, Principal of Data Science and Analytics at AWS, about sustainability efforts and the ethical considerations surrounding AI. Key Takeaways: Generative AI is being integrated into various business solutions. Unstructured data is crucial for enhancing customer experiences. Real-time analytics can improve customer complaint resolution. Sustainability is a key focus in AI resource management. Explainability in AI models is essential for ethical decision-making. The combination of structured and unstructured data enhances insights. AI innovations are making analytics more accessible to users. Trusted AI frameworks are vital for security and governance. Chapters: 00:00 - Introduction to the Partnership and Generative AI 02:50 - Technological Integration and Market Expansion 06:08 - Leveraging Unstructured Data for Insights 08:55 - Innovations in Customer Experience and Internal Processes 11:48 - Sustainability and Resource Optimization in AI 15:08 - Ensuring Ethical AI and Explainability 23:57 - Conclusion and Future Directions…
In this episode, Rachel Thornton, Fivetran's CMO, discusses the highlights of Big Data London 2024, including the launch of Fivetran Hybrid Deployment, which addresses the needs of organisations with mixed IT environments. The conversation delves into integrating AI into business operations, emphasising the importance of a robust data foundation. Additionally, data security and compliance challenges in the context of GDPR and other regulations are explored. The episode concludes with insights on the benefits of hybrid deployment for organisations. Key Takeaways: Big Data London 2024 is a significant event for data leaders. Fivetran Hybrid Deployment caters to organisations with mixed IT environments. AI integration requires a strong data foundation. Data security and compliance are critical in today's landscape. Organisations must understand their data sources for effective AI use. Hybrid deployment allows for secure data management. Compliance regulations are becoming increasingly stringent. Data readiness is essential for AI integration. Chapters: 00:00 - Introduction to Big Data London 2024 02:46 - Launch of Fivetran Hybrid Deployment 06:06 - Integrating AI into Business Operations 08:54 - Data Security and Compliance Challenges 11:50 - Benefits of Hybrid Deployment…
Managing network traffic efficiently is essential to control cloud costs. Network flow reports are critical in providing detailed insights into data movement across cloud environments. These reports help organisations identify usage patterns, track bandwidth consumption, and uncover inefficiencies that may lead to higher expenses. With a clear understanding of how data flows, businesses can make informed decisions to optimise traffic, reduce unnecessary data transfers, and allocate resources more effectively. This helps lower cloud costs, improves network performance, and enhances security by revealing unusual or potentially harmful traffic patterns. In this episode, Wayne Eckerson from Eckerson Group speaks to Ben Schaechter, CEO of Vantage, about optimising network traffic costs with Vantage’s Network Flow Reports. Key Takeaways: ● Network Flow Reports provide detailed insights into AWS costs. ● They help identify specific resources driving network traffic costs. ● Organisations can reduce costs by up to 90% with proper configuration. ● The shift towards cost management in cloud services is critical. ● FinOps teams are becoming essential for cloud cost optimization. ● Anomaly detection can alert teams to unexpected cost spikes. ● Vantage integrates with multiple cloud providers for comprehensive cost management. ● Effective cost management does not have to impact production workflows. Chapters: 00:00 - Introduction to Vantage and Network Flow Reports 02:52 - Understanding Network Flow Reports and Their Impact 06:09 - Real-World Applications and Case Studies 09:03 - The Shift in Cost Management Focus 11:54 - Tangible Benefits of Implementing Network Flow Reports 15:07 - The Role of FinOps in Cost Optimization 18:00 - Conclusion and Future Insights…
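The sketch below illustrates, in simplified form, what a network flow report surfaces: flow-log-style records aggregated by resource and traffic class, then ranked by estimated cost. The record fields and per-GB rates are hypothetical, not Vantage's implementation or actual AWS pricing.

```python
# Illustrative sketch: aggregate flow-log-style records to find the
# resources and traffic classes driving network spend. Rates are made up.
flows = [
    {"resource": "nat-gateway-1", "traffic": "cross-az", "gb": 900.0},
    {"resource": "api-service",   "traffic": "egress",   "gb": 350.0},
    {"resource": "nat-gateway-1", "traffic": "egress",   "gb": 120.0},
]
rate_per_gb = {"cross-az": 0.01, "egress": 0.09}  # hypothetical $/GB

costs: dict = {}
for f in flows:
    key = (f["resource"], f["traffic"])
    costs[key] = costs.get(key, 0.0) + f["gb"] * rate_per_gb[f["traffic"]]

# Rank to show which resource and traffic class combination drives spend.
for (resource, traffic), usd in sorted(costs.items(), key=lambda kv: -kv[1]):
    print(f"{resource:15s} {traffic:9s} ${usd:,.2f}")
```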
Safe Software’s FME is transforming Omaha’s approach to urban mobility with groundbreaking solutions for asset management, e-scooter tracking, and parking management. FME’s robust data integration capabilities are at the core of Omaha’s advancements. The data integration platform enables real-time tracking of e-scooters, offering precise data on their locations and usage patterns. This innovation enhances the management and accessibility of e-scooters, making urban mobility more efficient and user-friendly. Automated parking management processes, facilitated by FME, streamline city operations and reduce manual efforts. This automation leads to smoother parking experiences for residents and visitors, while dynamic rate adjustments, powered by FME, ensure that parking fees are responsive to real-time demand, optimising availability and revenue. In this episode, Wayne Eckerson from Eckerson Group speaks to Jacob Larson, an Applications Analyst from the City of Omaha, to discuss Omaha’s usage of FME. Key Takeaways: FME helps automate the tracking of e-scooters in real time. Data sharing agreements with providers like Lime enhance tracking capabilities. Omaha's parking management has been transformed through automation. FME allows for dynamic changes in parking rates based on events. The integration of GIS data with third-party APIs is crucial for parking management. Omaha is pioneering a real-time parking information system in the US. Chapters: 00:00 - Introduction to Omaha's Data Initiatives 01:03 - FME's Role in Asset Management 04:53 - Real-Time Tracking of E-Scooters 07:48 - Automating Parking Management 10:06 - Innovations in Parking Availability 12:59 - Dynamic Parking Rate Management…
Open source technologies are transforming how businesses manage real-time data on cloud platforms. By leveraging flexible, scalable, and cost-effective open-source tools, organisations can process and analyse large volumes of data with speed and precision. These technologies offer unmatched transparency, customisation, and community-driven innovation, making them ideal for real-time monitoring, analytics, and IoT applications. As data demands grow, open-source solutions ensure that businesses stay agile, reduce vendor lock-in, and maintain full control over their cloud infrastructure. The result? Faster insights, smarter decision-making, and enhanced performance—all powered by open source. In this episode, Paulina Rios Maya, Head of Industry Relations at EM360Tech, speaks to Mikhail Epikhin, Chief Technology Officer at Double Cloud, about The Power of Open Source in Cloud Platforms. Key Takeaways: Open-source technologies provide standard building blocks for products. Community-driven innovation is essential for the evolution of technology. Flexibility in data infrastructure is crucial for real-time processing. Observability and monitoring are vital for performance optimisation. Managed services can accelerate product development and feature implementation. Chapters: 00:00 - The Power of Open Source in Cloud Platforms 05:24 - Apache Airflow: Enhancing Real-Time Data Management 10:08 - Balancing Open Source and Managed Services 13:57 - Best Practices for Scalability and Performance…
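As a small illustration of the open-source building blocks mentioned above, here is a minimal Apache Airflow pipeline that extracts fresh events on a schedule and loads them into an analytics store. The DAG, task names, and schedule are hypothetical, and the API shown assumes Airflow 2.x.

```python
# Minimal Airflow 2.x sketch: an hourly extract-and-load pipeline.
# Task names, the load target, and the schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events(**_):
    # Placeholder: read the latest events from a queue or object store.
    return [{"event": "page_view", "ts": "2025-03-01T12:00:00Z"}]

def load_events(**_):
    # Placeholder: insert the extracted batch into the analytics database.
    print("loading batch into analytics store")

with DAG(
    dag_id="realtime_events_to_analytics",
    start_date=datetime(2025, 1, 1),
    schedule="@hourly",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_events", python_callable=load_events)
    extract >> load  # run extraction before load
```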
Big Data LDN 2024, the UK's leading data, analytics, and AI event, is less than a week away – promising two days filled with ground-breaking stories, expert insights, and endless innovation. Taking place at the Kensington Olympia in London on September 18-19, this year's event features fifteen theatres and over 300 expert speakers sharing insights on some of the industry's hottest topics – from generative AI to data analytics and privacy. With the event less than a week away, EM360Tech's Head of Podcast Production, Paulina Rios Maya, grabbed Big Data LDN's Event Director, Andy Steed, for a chat about his expectations for this year's event and its growing importance in the data world. In the episode, they discuss: the exciting themes and breakthroughs attendees can expect to see showcased this year; how Big Data London remains relevant in such a rapidly evolving field; the unique networking opportunities and interactive experiences attendees have at the conference; and the standout sessions and keynote speakers at the conference. Chapters: 00:00: Introduction to Big Data LDN 2024 01:35: Showcasing Data Stories, Transformations, and Challenges 02:33: The Networking Opportunities with Industry Leaders and Peers at Big Data LDN 2024 05:01: Staying Relevant with a Focus on Generative AI and Real-World Use Cases 06:55: The Importance of Data Events for Community Building and Learning About Big Data LDN 2024 Big Data London is the UK's largest data and analytics event, attracting over 16,500 visitors each year. Taking place at the Olympia in London on September 18-19, this year's event features fifteen theatres and over 300 expert speakers across the two-day conference. Attendees can meet face-to-face with tech providers and consultants to find solutions to their data challenges and view the latest product releases and software demos to enhance their business's data capabilities. It's also a great opportunity for attendees to strengthen their business network with new and existing partners, and immerse themselves within the data community and network with speakers, colleagues, and practitioners, all in two days at Big Data LDN. Sign up to Big Data LDN 2024 today and be part of the UK's leading hub for the data community to learn and share best practices, build relationships, and find the tools needed to develop an effective data-driven business. We can't wait to see you there!…
Sustainable sourcing is essential for businesses committed to environmental and social responsibility, but achieving it requires accurate and reliable data. Master Data Management (MDM) ensures that all sourcing data—such as supplier information, certifications, and compliance records—is consistent and up-to-date. This enables organisations to make informed decisions that align with their sustainability goals, reduce waste, and promote ethical practices throughout their supply chain. MDM is the foundation of a successful sustainability strategy. By providing a single source of truth for all critical data, MDM helps businesses monitor and track their sustainability efforts effectively. With accurate data, companies can identify opportunities to improve resource efficiency, reduce carbon footprints, and ensure compliance with environmental standards, ultimately leading to a more sustainable and resilient business model. In this episode, George Firican, Founder of LightsOnData, speaks to Matthew Cawsey, Director of Product Marketing and Solution Strategy, and Paarijat Bose, Customer Success Manager at Stibo Systems, to discuss sustainable sourcing and why accurate data matters. Key Takeaways: Sustainable sourcing involves understanding the provenance and environmental impact of products, ensuring compliance with regulations, and meeting sustainability goals. Data completeness and accuracy are crucial in meeting regulatory requirements and avoiding issues like greenwashing. Managing sustainability data requires a solid foundation of MDM to ensure data accuracy, stewardship, and semantic consistency. MDM solutions help companies collect, manage, and share sustainability data, enabling them to meet compliance requirements and achieve their sustainability goals. Chapters: 00:00 - Introduction and Overview 01:07 - The Challenge of Collecting Data for Compliance and Reporting 02:31 - Data Accuracy and Completeness in the Supply Chain 05:23 - Regulations and the Demand for Transparent and Complete Data 08:41 - The Role of Master Data Management in Sustainability 15:51 - How Data Management Technology Solutions Help Achieve Sustainability Goals 21:02 - The Need to Start Early and Engage with Data Management Solutions 22:01 - Conclusion and Call to Action…
Data provenance is essential for maintaining trust and integrity in data management. It involves tracking the origin of data and understanding how it has been processed and handled over time. By focusing on fundamental principles such as identity, timestamps, and the content of the data, organisations can ensure that their data remains accurate, consistent, and reliable. Implementing data provenance does not require significant changes or large investments. Existing technologies and techniques can be seamlessly integrated to provide greater transparency and control over data. With data provenance, businesses can confidently manage their data, enhancing decision-making and fostering stakeholder trust. In this episode, Jon Geater, Co-Chair of the Supply Chain Integrity Transparency and Trust (SCITT) Working Group, speaks to Paulina Rios Maya, Head of Industry Relations, about data provenance. Key Takeaways: Data provenance is knowing where data comes from and how it has been handled, ensuring trust and integrity. The fundamental principles of data provenance include identity, timestamps, and the content of the data. Data provenance can be implemented by integrating existing technologies and techniques without significant changes or investments. Data provenance helps with compliance, such as GDPR, by providing a transparent record of data handling and demonstrating compliance with requests. Chapters: 00:00 - Introduction and Background 02:01 - Understanding Data Provenance 05:47 - Implementing Data Provenance 10:01 - Data Provenance and Compliance 13:50 - Success Stories and Industry Applications 18:10 - Conclusion and Call to Action…
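To illustrate the three ingredients mentioned above (identity, timestamps, and content), here is a simplified Python sketch of an append-only provenance log in which each entry hashes the data it describes and links to the previous entry. It is an illustration only, not the SCITT specification.

```python
# Simplified provenance log: each entry records identity, timestamp, and a
# content hash, and links to the previous entry. Not the SCITT spec.
import hashlib
import json
from datetime import datetime, timezone

def record_step(log: list, actor: str, action: str, content: bytes) -> None:
    """Append a provenance entry linking to the previous one by hash."""
    prev_hash = log[-1]["entry_hash"] if log else None
    entry = {
        "actor": actor,                                        # identity: who handled the data
        "timestamp": datetime.now(timezone.utc).isoformat(),   # when it was handled
        "content_hash": hashlib.sha256(content).hexdigest(),   # what the data was at that point
        "action": action,
        "prev_hash": prev_hash,                                # link to earlier history
    }
    entry["entry_hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

log: list = []
record_step(log, "ingest-service", "received", b"raw supplier file v1")
record_step(log, "etl-job-42", "normalised", b"cleaned supplier file v1")
print(json.dumps(log, indent=2))
```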
FME is a vital tool in disaster management and response. It enables the integration and transformation of geospatial data for real-time tracking of disasters and hazards. By ensuring accurate and timely data analysis, it provides essential decision support for disaster management professionals. During the Maui wildfires, FME and the Pacific Disaster Centre were crucial in managing and analysing critical data, allowing for effective coordination and response. By facilitating seamless data sharing and collaboration among stakeholders, FME helps ensure that the correct information reaches the right people at the right time. In this episode of the EM360 Podcast, Alejandro Leal, an Analyst at KuppingerCole, speaks to Jorma Rodieck, a GIS Specialist at the Pacific Disaster Centre, about the importance of FME. Key Takeaways: FME is an essential tool in disaster management and response, allowing for the integration and transformation of geospatial data. FME enables real-time data analysis and decision support for disaster management professionals. During the Maui wildfires, FME was instrumental in managing and analyzing critical data, providing a common operating picture for response efforts. FME ensures effective data sharing and collaboration among various stakeholders, enabling smooth interoperability between departments and agencies. Chapters: 00:00 - Introduction and Background 02:35 - The Role of FME in Disaster Management 06:44 - Managing and Analyzing Critical Data with FME 10:34 - FME's Impact during the Maui Wildfires 11:59 - Ensuring Effective Data Sharing and Collaboration 15:20 - The Future of FME in the Pacific Disaster Center 18:15 - Conclusion…
Open source real-time analytics offers unparalleled advantages, providing businesses with freedom and independence to maintain operations seamlessly, even if a vendor issue arises. However, the journey isn't without its challenges. Open source solutions can often be clunky and require specialised expertise to manage effectively. This is where DoubleCloud comes in, offering a managed platform that addresses these obstacles by handling crucial responsibilities such as backups, high availability, and security updates, allowing businesses to focus on leveraging their data. In this podcast, Christina Stathopoulos speaks to Vladimir Borodin, Co-Founder and CEO of DoubleCloud, about open source strategies and the advantages of the DoubleCloud solution. Key Takeaways: DoubleCloud's managed platform helps overcome the challenges of open source, such as clunkiness and a lack of expertise. Successful customer use cases demonstrate the performance and cost benefits of DoubleCloud's solution. The transition phase to DoubleCloud's solution depends on the complexity of the application. Using open source whenever possible is recommended. Chapters: 00:00 - Introduction and Background 02:29 - The Advantages of Open Source 04:21 - Challenges of Open Source 06:47 - The Power of Real-Time Analytics 09:11 - Success Stories: Improved Performance and Reduced Costs 12:54 - Navigating the Transition to DoubleCloud's Solution 15:14 - The Importance of Using Open Source…
Privacy by Default and Design is a fundamental principle of the General Data Protection Regulation (GDPR). It prioritises transparency, user control, and data security from the outset. This approach ensures that privacy is integrated into systems and processes by default rather than as an afterthought. By embedding these practices, organisations enhance trust and accountability while meeting regulatory requirements. However, challenges such as resistance to change and the need for cultural transformation must be addressed to implement this principle effectively. In this episode of the Don't Panic, It's Just Data podcast, Tudor Galos, Senior Privacy Consultant, speaks to Paulina Rios Maya, Head of Industry Relations, about how the impact of privacy by default and design extends to user experience, where issues like consent fatigue and the necessity for user-friendly interfaces arise. Key Takeaways: Organisations face challenges in implementing privacy by default and design, including resistance to change and the need for cultural transformation. Privacy by default and design impact user experience, with issues like consent fatigue and the need for user-friendly interfaces. Regulations like GDPR and CCPA incorporate privacy by default and design principles, emphasising compliance and accountability. Chapters: 00:00 - Introduction and Overview 01:00 - Core Principles of Privacy by Default and Design 02:19 - Difference from Traditional Privacy Practices 04:09 - Challenges in Implementing Privacy by Default and Design 05:33 - Impact of Privacy by Default on User Experience 08:14 - Alignment of Privacy by Default with Regulations 09:04 - Ensuring Compliance and Trust 11:24 - Implications of Emerging Technologies on Privacy 13:15 - Innovations in Privacy-Enhancing Technologies 15:50 - Conclusion…
Safe Software's Feature Manipulation Engine (FME) plays a pivotal role in the City of Fremont's operations, particularly in ensuring accurate and efficient data submissions under the Racial and Identity Profiling Act (RIPA). By automating complex workflows and enhancing data quality, FME not only ensures seamless compliance with RIPA requirements but also optimises processes for their ITS and GIS divisions. FME also drives innovation in projects like the DroneSense programme and their Cityworks asset management integration. With seamless data integration and powerful visualisations, FME empowers the City of Fremont to enhance operations, improve asset management, and support informed decision-making. In this episode, Jonathan Reichental, Founder at Human Future, speaks to John Leon, GIS Manager for the City of Fremont, to discuss: FME, RIPA, and public safety. Chapters: 00:00 - Introduction and Overview of the City of Fremont and IT/GIS Division 03:01 - Explanation of the Racial and Identity Profiling Act (RIPA) 04:27 - Challenges in Meeting RIPA Standards and Utilizing FME 06:21 - How FME Ensures Error-Free RIPA Data Submissions 09:40 - Benefits of Using FME for RIPA Compliance 10:39 - Other Innovative Projects Utilizing FME in the City of Fremont 13:30 - Future Plans for FME in the City of Fremont 17:17 - Recommendations for Government Agencies: Leverage FME for Data Submissions…
Real-time data insights help identify performance bottlenecks, manage data efficiently, and drive innovation. Despite the growing need for these capabilities, organisations often face challenges in implementing effective real-time analytics. Achieving high-concurrency data processing is crucial for overcoming performance bottlenecks in real-time analytics. Embracing real-time analytics is not just a necessity, but a way to transform your data into actionable insights, optimise performance, and fuel business growth. Yellowbrick is a modern data platform built on Kubernetes for enterprise data warehousing, ad-hoc and streaming analytics, AI and BI workloads that ensures comprehensive data security, unparalleled flexibility, and high performance. In this podcast, Doug Laney, a Data Strategy Innovation Fellow with West Monroe, speaks to Mark Cusack, the CTO of Yellowbrick, about the power of real-time analytics. Key Takeaways: Real-time analytics enables faster business decisions based on up-to-date data and focuses on enabling actions. Using a SQL data platform like Yellowbrick, designed for high-concurrency data processing, can address performance bottlenecks in real-time analytics. Chapters: 00:00 - Introduction and Overview 01:07 - The Benefits of Real-Time Analytics 06:23 - Overcoming Challenges in Implementing Real-Time Analytics 06:51 - High Concurrency Data Processing for Real-Time Analytics 13:59 - Yellowbrick: A Secure and Efficient SQL Data Platform…
Accurate and reliable data is essential for training effective AI models. High-quality data ensures precision, reduces bias, and builds trust in AI systems. Similarly, Master Data Management (MDM) systems enhance data quality by integrating data from multiple sources, enforcing data governance, and providing a single source of truth. This helps eliminate discrepancies and maintain data integrity. Integrating Product Information Management (PIM) with MDM ensures accurate and consistent product data across all channels, crucial for data-driven marketing. This combination centralises customer and product data, enabling precise targeting and personalised experiences. MDM and PIM integration leads to higher ROI and improved customer satisfaction by supporting effective marketing strategies. In this episode of the EM360 Podcast, Paulina Rios Maya speaks to Philipp Krueger about integrating PIM and MDM functionalities and how it streamlines operations, improves data accuracy and supports data-driven marketing strategies. Chapters 00:00 - Introduction and Importance of Data Quality in AI Models 05:27 - Core Capabilities of an MDM System 08:13 - The Role of Data Governance in Data Management 13:37 - Enhancing Customer Experience and Driving Sales with Pimcore 19:47 - Integration of PIM and MDM Functionalities for Data-Driven Marketing Strategies 22:59 - The Impact of Accurate Data on Revenue Growth 27:28 - Simplifying Data Management with a Single Platform…
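As a simplified illustration of the single source of truth that MDM provides, the sketch below merges the same customer from several systems into a golden record using basic survivorship rules. The source names, fields, and rules are hypothetical and not Pimcore's data model.

```python
# Illustrative MDM "golden record" sketch: survivorship rules pick, per field,
# the most recent (then most trusted) non-null value. All data is made up.
records = [
    {"source": "crm",     "updated": "2025-01-10", "email": "jane@corp.com",  "phone": None},
    {"source": "webshop", "updated": "2025-03-02", "email": "jane@corp.com",  "phone": "+44 20 7946 0000"},
    {"source": "erp",     "updated": "2024-11-20", "email": "j.doe@corp.com", "phone": None},
]
source_trust = {"crm": 3, "webshop": 2, "erp": 1}  # higher value wins ties

def golden_record(recs: list) -> dict:
    """Merge duplicate customer records into one consolidated view."""
    ordered = sorted(recs, key=lambda r: (r["updated"], source_trust[r["source"]]), reverse=True)
    merged = {}
    for field in ("email", "phone"):
        # Take the first non-null value from the newest, most trusted record.
        merged[field] = next((r[field] for r in ordered if r[field] is not None), None)
    return merged

print(golden_record(records))  # {'email': 'jane@corp.com', 'phone': '+44 20 7946 0000'}
```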
One of the biggest challenges businesses face when it comes to data visualisation is handling the volume of data and the need for faster processing methods. There's a common misconception that effective data visualisation must be fancy and interactive, but simple visuals can be just as powerful. Ann K. Emery, an expert in the field, believes that accessibility doesn't have to be time-consuming or expensive. In this podcast, she shares actionable strategies for creating accessible visualisations with Paulina Rios Maya, Head of Industry Relations at EM360Tech; two of these strategies are sketched in the example below.

Key Takeaways:

- Avoiding red-green colour combinations
- Ensuring proper colour contrast
- Using direct labelling instead of legends
- Avoiding all-caps text
- Using grey so that important information stands out
- Employing small multiples to simplify complex visualisations

Chapters:

00:00 - Introduction
00:54 - Defining Accessibility in Data Visualization
02:17 - Big A Accessibility Tips
06:36 - Little a Accessibility Strategies
12:28 - The Future of Data Accessibility…
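To make a couple of these strategies concrete, here is a minimal sketch, assuming matplotlib and an invented sales dataset, of direct labelling (panel titles instead of a legend), grey context lines so the highlighted series stands out, and small multiples. It illustrates the general techniques discussed in the episode rather than any specific example from it.

```python
# Minimal sketch: direct labelling, grey context, and small multiples.
# The data is invented for illustration; only matplotlib is assumed.
import matplotlib.pyplot as plt

years = [2020, 2021, 2022, 2023]
regions = {
    "North": [12, 15, 18, 22],
    "South": [10, 11, 13, 14],
    "West": [8, 12, 16, 21],
}

fig, axes = plt.subplots(1, len(regions), figsize=(9, 3), sharey=True)

for ax, (name, values) in zip(axes, regions.items()):
    # Plot every other series in light grey so it reads as context...
    for other, other_values in regions.items():
        if other != name:
            ax.plot(years, other_values, color="lightgrey", linewidth=1)
    # ...and the highlighted series in a single colour-blind-safe hue.
    ax.plot(years, values, color="#0072B2", linewidth=2.5)
    ax.set_title(name, loc="left")  # direct label in place of a legend
    ax.spines["top"].set_visible(False)
    ax.spines["right"].set_visible(False)

axes[0].set_ylabel("Sales ($M)")
fig.suptitle("Sales by region (small multiples, direct labels)")
fig.tight_layout()
plt.savefig("small_multiples.png", dpi=150)
```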
Managing cloud costs effectively has become a significant challenge for organisations relying on public cloud services. FinOps addresses these challenges by ensuring efficient spending and governance of cloud resources. Key practices in FinOps include achieving complete visibility into cloud usage and costs (illustrated in the sketch below), fostering cross-functional collaboration between finance, operations, and engineering teams, and utilising data-driven decision-making to optimise cloud investments.

By embracing a centralised team, organisations can instil a culture of governance and efficiency in cloud cost management. This approach can lead to enhanced resource utilisation and substantial cost savings. With Vantage, your organisation can cultivate a robust cloud cost governance and efficiency culture, ensuring your cloud investments yield maximum value.

In this episode of the EM360 Podcast, Kevin Petrie, VP of Research at BARC US, speaks to Ben Schaechter, CEO and co-founder of Vantage, to discuss:

- FinOps
- Vantage’s platform
- Cloud costs and FinOps practices

Chapters:

00:00 - Introduction and Overview
02:02 - Understanding FinOps and Cloud Cost Governance
07:45 - Best Practices in FinOps: Centralization and Collaboration
13:50 - The Role of Data-Driven Insights in Optimizing Cloud Costs…
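As a simple illustration of the visibility practice described above, the sketch below rolls invented billing line items up by team tag and flags month-over-month jumps so finance, operations, and engineering teams can review the same numbers. It uses only the Python standard library and invented figures; it is not Vantage's API, which automates this against real provider billing data.

```python
# Minimal sketch of FinOps cost visibility: aggregate spend by team tag
# and flag month-over-month growth above a threshold. Data is invented.
from collections import defaultdict

billing_records = [
    {"team": "payments", "month": "2024-02", "cost": 1200.0},
    {"team": "payments", "month": "2024-03", "cost": 1900.0},
    {"team": "search",   "month": "2024-02", "cost": 800.0},
    {"team": "search",   "month": "2024-03", "cost": 820.0},
]

# Roll raw line items up to team/month totals.
costs = defaultdict(dict)
for rec in billing_records:
    monthly = costs[rec["team"]]
    monthly[rec["month"]] = monthly.get(rec["month"], 0.0) + rec["cost"]

ALERT_THRESHOLD = 0.25  # flag >25% month-over-month growth for review

for team, by_month in costs.items():
    months = sorted(by_month)
    for prev, curr in zip(months, months[1:]):
        growth = (by_month[curr] - by_month[prev]) / by_month[prev]
        status = "REVIEW" if growth > ALERT_THRESHOLD else "ok"
        print(f"{team}: {prev} -> {curr}  {growth:+.0%}  [{status}]")
```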
Managing large volumes of data in the context of AI and machine learning applications presents challenges related to data quality, data preparation, and automation. The requirements of data management are changing with the advent of generative AI, requiring more flexibility and the ability to handle larger volumes of data.

Pimcore leverages AI and machine learning to automate data utilization and improve data intelligence. By streamlining data management and integrating various data sources, Pimcore drives revenue growth for its customers. The platform combines data management and experience management to deliver personalized data across communication channels. Pimcore’s MDM solution addresses the challenges of integrating data for both human and machine consumption. The choice between physical and virtual MDM hubs depends on the use case and industry.

In this episode of the EM360 Podcast, Doug Laney, Data and Analytics Strategy Innovation Fellow at West Monroe, speaks to Dietmar Rietsch, Managing Director and Co-Founder of Pimcore, to discuss:

- Data management
- AI
- Machine learning
- Data quality…
Maximising data relationships through text analytics, particularly with tools like LLMs and Knowledge Graphs, offers organisations unprecedented insights and capabilities. By leveraging these advanced technologies, businesses can unlock hidden connections and patterns within their data, leading to more informed decision-making and strategic planning.

Integrating Ontotext's solutions is a game-changer, empowering organisations to extract, organise, and visualise complex information from unstructured data sources. With Ontotext's expertise in semantic technology, businesses can construct robust knowledge graphs that offer a comprehensive understanding of their data landscape. This approach not only facilitates better analysis and interpretation of data but also ignites innovation and propels business growth in today's increasingly data-driven world.

In this episode of the EM360 Podcast, Paulina Rios Maya, Head of Industry Relations, speaks to Doug Kimball, Chief Marketing Officer at Ontotext, to discuss:

- AI in Enterprise Knowledge
- LLMs
- Knowledge Graphs

Chapters:

00:00 - Challenges of Integrating LLMs into Enterprise Knowledge Management Systems
04:35 - Enhancing Compatibility and Efficacy with Knowledge Graphs
07:21 - Innovative Strategies for Integrating LLMs into Knowledge Management Frameworks
11:07 - The Future of LLM-Driven Knowledge Management Systems: Intelligent Question Answering and Insight Enablement…
Managing cloud computing costs is a pressing challenge faced by organisations of all sizes across industries. As businesses increasingly migrate their operations to the cloud, the complexity of managing and optimizing costs grows exponentially. Without proper oversight and strategy, cloud expenses can quickly spiral out of control, leading to budget overruns and financial inefficiencies.

Vantage addresses this issue head-on by providing organizations with a powerful platform equipped with automated cost recommendations, customizable reports, and real-time monitoring capabilities. By leveraging advanced analytics and machine learning, Vantage empowers teams to gain unparalleled visibility into their cloud spending and make informed decisions to optimize costs.

In this episode of the EM360 Podcast, Dana Gardner, Principal Analyst at Interarbor Solutions, speaks to Ben Schaechter, CEO and Co-founder of Vantage, to discuss:

- Cloud cost management
- FinOps
- Cost optimization
- Automated cost recommendations…
Ensuring the reliability and effectiveness of AI systems remains a significant challenge. In most use cases, generative AI must be combined with access to your company data, a process called retrieval-augmented generation (RAG). The results from generative AI are vastly improved when the model is enhanced with contextual data from your organization.

Most practitioners rely on vector embeddings to surface content based on semantic similarity. While this can be a great step forward, achieving good quality requires a combination of multiple vectors with text and structured data, using machine learning to make final decisions, as sketched in the example below. Vespa.ai, a leading player in the field, enables solutions that do this while keeping latencies suitable for end users, at any scale.

In this episode of the EM360 Podcast, Kevin Petrie, VP of Research at BARC US, speaks to Jon Bratseth, CEO of Vespa.ai, to discuss:

- The opportunity for generative AI in business
- Why you need more than vectors to achieve high quality in real systems
- How to create high-quality generative AI solutions at an enterprise scale…
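To ground the "more than vectors" point, here is a minimal, self-contained sketch that scores documents by blending a vector-similarity signal with a keyword-match signal and a structured recency signal before selecting context for the generative model. The documents, embeddings, and hand-set weights are invented for illustration; this is the generic pattern, not Vespa.ai's API, which evaluates such ranking functions (including learned models) at scale.

```python
# Minimal sketch of hybrid retrieval for RAG: vector + text + structured
# signals combined into one ranking score. All data here is invented.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_overlap(query, text):
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

documents = [
    {"text": "Q3 revenue grew 12% in the EMEA region",
     "embedding": [0.9, 0.1, 0.2], "recency": 1.0},
    {"text": "Office relocation policy for contractors",
     "embedding": [0.1, 0.8, 0.3], "recency": 0.2},
]

def score(query_text, query_embedding, doc, weights=(0.6, 0.3, 0.1)):
    # In a real system the weights would come from a trained ranking model.
    w_vec, w_text, w_struct = weights
    return (w_vec * cosine(query_embedding, doc["embedding"])
            + w_text * keyword_overlap(query_text, doc["text"])
            + w_struct * doc["recency"])

query = "How did EMEA revenue change last quarter"
query_embedding = [0.85, 0.15, 0.25]  # would come from an embedding model
ranked = sorted(documents, key=lambda d: score(query, query_embedding, d),
                reverse=True)
print(ranked[0]["text"])  # context passed to the generative model
```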
Geographic Information Systems (GIS) have transformed urban landscape analysis and government policy creation, albeit not without challenges. In the past, GIS analysts often visited locations to piece together information physically. With the help of cutting-edge platforms like Safe Software's FME, cities like Burnaby, British Columbia, have revolutionised their operations. This has led to a significant enhancement in the quality of life for their residents. From predictive modelling to real-time data analysis, the potential for innovation appears boundless, underscoring the importance of GIS technology in improving urban operations.

In this episode of the EM360 Podcast, Wayne Eckerson speaks to Herman Louie, GIS Analyst at the City of Burnaby, to discuss:

- Design and implementation of GIS solutions
- Safe Software’s FME platform
- Transition to NG9-1-1
- The future of GIS…
Government organisations face a multitude of challenges when it comes to managing their data effectively. From interoperability issues between systems to the need for seamless collaboration across agencies, the complexity can be overwhelming. Safe Software's FME platform offers a comprehensive solution to these challenges by providing a flexible and intuitive data integration platform tailored to the unique needs of government agencies.

With FME, government organisations can overcome the barriers that hinder efficient data management. FME enables streamlined operations and improved decision-making processes by seamlessly connecting disparate systems and applications. Whether it's digital plan submissions, emergency services coordination, or interagency health data sharing, FME empowers government agencies to achieve their data integration goals with ease.

In this episode of the EM360 Podcast, Doug Laney, Data and Analytics Strategy Innovation Fellow at West Monroe, speaks to Tom Seymour, Government Sales Team Lead at Safe Software, to discuss:

- Data integration and interoperability
- Safe Software’s FME platform
- FME in governments
- Advantages of FME
- ROI with FME…
Ever wonder how search engines understand the difference between "apple," the fruit, and Apple, the tech company? It's all thanks to knowledge graphs! These robust and scalable databases map real-world entities and link them together based on their relationships (see the small example below). Imagine a giant web of information where everything is connected and easy to find. Knowledge graphs are revolutionizing how computers understand and process information, making it richer and more relevant to our needs.

Ontotext is a leading provider of knowledge graph technology, offering a powerful platform to build, manage, and utilise knowledge graphs for your specific needs. Whether you're looking to enhance search capabilities, improve data analysis, or unlock new insights, Ontotext can help you leverage the power of connected information.

In this episode of the EM360 Podcast, George Firican, Founder of LightsOnData, speaks to Sumit Pal, Strategic Technology Director at Ontotext, to discuss:

- Knowledge Graphs
- Use Cases
- Ontotext GraphDB
- Integration of AI
- Industry best practices…
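As a small illustration of how a knowledge graph tells the two "apples" apart, the sketch below builds a handful of triples with rdflib and queries them with SPARQL. The example.org namespace and the facts are invented, and this is a generic RDF example rather than Ontotext GraphDB's own data model.

```python
# Minimal sketch: two entities share the label "apple" but are kept
# distinct by their types and relationships. Uses rdflib; data invented.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/")
g = Graph()

# Apple the company, identified by its relationships.
g.add((EX.Apple_Inc, RDF.type, EX.Company))
g.add((EX.Apple_Inc, RDFS.label, Literal("Apple")))
g.add((EX.Apple_Inc, EX.headquarteredIn, EX.Cupertino))

# apple the fruit, a different node entirely.
g.add((EX.Apple_Fruit, RDF.type, EX.Fruit))
g.add((EX.Apple_Fruit, RDFS.label, Literal("apple")))
g.add((EX.Apple_Fruit, EX.growsOn, EX.AppleTree))

# The label alone is ambiguous; the graph structure resolves it.
query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?entity ?type WHERE {
    ?entity rdfs:label ?label ;
            a ?type .
    FILTER(LCASE(STR(?label)) = "apple")
}
"""
for entity, entity_type in g.query(query):
    print(f"{entity} is a {entity_type}")
```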
The traditional data warehousing landscape is changing. The concept of the private data cloud offers a compelling alternative to both cloud PaaS and traditional data warehousing. Imagine a secure, dedicated environment for your data, existing entirely within your organisation's control.

Yellowbrick, a leader in private data cloud solutions, empowers businesses to leverage their data on their terms. Their Bring Your Own Cloud (BYOC) approach offers unmatched flexibility and control. You can deploy Yellowbrick anywhere your data needs to be: public cloud, private cloud, or even the network edge. This ensures compliance with regulations, keeps your data exactly where you want it, and can bring down costs.

In this episode of the EM360 Podcast, Wayne Eckerson, President of Eckerson Group, speaks to Mark Cusack, Chief Technology Officer of Yellowbrick, to discuss:

- The need for hybrid multi-cloud data platforms
- How Yellowbrick differentiates
- The future of private data cloud
- Why Yellowbrick?…
The data analysis landscape is on the precipice of a paradigm shift. Generative AI (GenAI) promises revolutionary insights, but traditional systems struggle to feed its insatiable appetite for well-structured data. Imagine GenAI as a high-powered engine: it needs meticulously organised fuel to reach its full potential.

In this episode of the EM360 Podcast, Analyst Christina Stathopoulos guides Deborah Leff (CRO at SQream) and Jason Hardy (CTO, AI at Hitachi Vantara) as they dissect:

- The Bottlenecks of Traditional Systems
- Unlocking GPU Potential
- Fueling Insights with Structured Data
- Faster Time to Insights

This episode goes beyond theory, exploring real-world examples (like a company querying a staggering 64 quadrillion rows). It showcases the potential for SQream and Hitachi Vantara to empower organisations to make data-driven decisions with unprecedented speed and accuracy.…
Cloud-native is the new gold rush for businesses seeking speed, efficiency, and innovation. But are you getting the most out of your investment? Legacy troubleshooting and observability tools can be hidden anchors, dragging down your developers' productivity. The result? You're not reaping the full benefits of cloud-native, and your competitors are leaving you in the dust.

Chronosphere, a leader in modern observability, can empower your developers and unlock the true potential of cloud-native. Buckle up and get ready to discover how observability can become your secret weapon for unleashing developer agility and innovation.

In this episode of the EM360 Podcast, Kevin Petrie, VP of Research at Eckerson Group, speaks to Ian Smith, Field CTO at Chronosphere, to discuss:

- Observability
- Cloud-native software developers
- Developer inefficiency
- The Chronosphere solution
- Data pipeline best practices…
Intelligent document processing (IDP) is a technology-driven approach that automates document processing and the extraction of valuable information. While handling structured data is relatively straightforward, processing and analyzing unstructured data is laborious. IDP equips users with the ability to process a multitude of document types, including PDFs, spreadsheets, and Word documents, among others.

IDP platforms offer a powerful solution that streamlines data extraction from these documents by eliminating the need for manual intervention; the sketch below shows, by contrast, the kind of hand-written extraction work being automated. The extracted data, when integrated, enables you to make reliable decisions and improve business efficiency.

In this episode of the EM360 Podcast, Analyst Christina Stathopoulos speaks to Jay Mishra, COO at Astera, to discuss:

- Traditional document processing and the problems it causes
- AI-powered solutions
- The future of document processing and data extraction

Find out more about Astera’s unified, no-code data management platform here.…
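To show the kind of manual effort IDP automates, here is a minimal sketch that pulls a few fields out of an invented invoice with hand-written regular expressions. The document text and field patterns are illustrative only; a platform like Astera's replaces this brittle, per-document-type approach with AI-driven, no-code extraction across PDFs, spreadsheets, and other formats.

```python
# Minimal sketch of rule-based field extraction from unstructured text.
# The invoice and patterns are invented for illustration.
import re

raw_document = """
INVOICE #INV-2024-0042
Vendor: Acme Logistics Ltd.
Date: 2024-03-18
Total Due: $4,750.00
"""

patterns = {
    "invoice_number": r"INVOICE\s+#(\S+)",
    "vendor": r"Vendor:\s*(.+)",
    "date": r"Date:\s*(\d{4}-\d{2}-\d{2})",
    "total_due": r"Total Due:\s*\$([\d,]+\.\d{2})",
}

record = {}
for field, pattern in patterns.items():
    match = re.search(pattern, raw_document)
    record[field] = match.group(1).strip() if match else None

# The structured record can now feed downstream systems (ERP, analytics, ...).
print(record)
```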
Rainfall. Temperature. Humidity. Natural disasters. Human methods of reading and predicting the weather can be traced all the way back to native tribes and ancient civilisations, and they are still prevalent in the modern world today. Whether (no pun intended) it's an agricultural organisation looking to leverage precipitation data for planting schedules, energy companies looking at temperature trends to predict energy consumption patterns, or transportation outfits looking to avoid delays, mastering weather data can really help modern companies protect themselves against the unknown.

In this episode of the EM360 Podcast, Analyst Susan Walsh speaks to Christian Schluchter, CTO at Meteomatics, to discuss:

- Key use cases of weather data
- Mastering data access and predictive analytics
- The importance of being data-driven…
Maximize business value with data products: they incorporate essential data and related capabilities to meet key business objectives. Data products contain datasets, related metadata, and a wide range of functionality to understand if data is fit for use. But unlike relying on a raw dataset or data pipeline to generate value, data products deliver a comprehensive, packaged solution that enables data users to achieve their data-driven goals with easier accessibility.

In this EM360 Podcast episode, Head of Content Matt Harris sits down with Nathan Turajski, Senior Director of Product Marketing at Informatica. They explore how to:

- Drive better business outcomes with data products
- Understand the benefits of user-friendly and efficient consumption
- Improve AI, analytics and customer experience initiatives…