Over the years, many of us working in the Oracle analytics space have helped customers implement Oracle Business Intelligence Applications (OBIA) - a powerful solution in its time, offering prebuilt analytics across ERP, HCM and more. But let's be honest: it had its fair share of complexity, rigidity, and technical debt. If you ever spent hours managing DAC, tweaking ETL mappings, or retrofitting OBIA customisations after a patch, you'll understand why Fusion Data Intelligence feels like Oracle finally got analytics right. Fast-forward to today and we've entered a new era with Oracle Fusion Data Intelligence (FDI) - a reimagined, cloud-native analytics platform designed from the ground up for the Fusion SaaS landscape. If you've ever battled with OBIA's extensibility, upgrade cycles or data latency, FDI is likely to feel like a breath of fresh air. This post is the first in a short series unpacking what FDI actually is, how it compares with its predecessors, and what it means for Fusion customers today.

Oracle's recent growth

Over the past 2–3 years, Oracle has consistently grown its cloud business, with total revenue rising from $40.5 billion in FY2022 to $57.4 billion in FY2025, driven largely by strong momentum in Fusion Cloud Applications, NetSuite, and OCI (Oracle Cloud Infrastructure). While Oracle doesn't match the scale of hyperscalers like AWS or Microsoft Azure in infrastructure alone, its distinct advantage lies in its full-stack strategy - uniquely offering enterprise SaaS, infrastructure, and the database layer under one roof. This vertically integrated model means Oracle can optimise performance, security, and cost across its stack, especially for Fusion workloads. Competitors like SAP and Workday lead in applications but lack native cloud infrastructure; AWS and Azure dominate infrastructure but rely on third-party SaaS partners. Oracle, by contrast, continues to blur the lines between application and platform, using technologies like Autonomous Database, OCI Gen2, and now Fusion Data Intelligence to deliver insights that are deeply embedded, secure, and performant - all within its own ecosystem. These figures aren't just impressive - they're a strong signal that Oracle's SaaS portfolio is achieving scale and maturity, particularly in core enterprise functions like Finance, HR, and Operations. Fusion ERP alone has grown from $0.9B to $1.0B in quarterly revenue, underscoring widespread enterprise adoption.

From Adoption to Insight: The Next Frontier

As organisations continue investing in Oracle Fusion Cloud applications, the expectation isn't just automation - it's intelligence. Businesses aren't content with simply moving transactional processes to the cloud; they want to understand the return on those investments, monitor performance in real time, and use their data to make faster, smarter decisions. This is where Fusion Data Intelligence (FDI) steps in. Just as Oracle's adoption of Fusion SaaS pillars is accelerating, so too is the demand for embedded, governed, cross-functional insights that empower users in the flow of work. With SaaS platforms becoming the new systems of record, the analytics layer must evolve in lockstep - and be natively integrated, secure, and scalable. FDI is that evolution.

Why FDI Matters Now More Than Ever
FDI bridges this critical gap by turning raw operational data into actionable intelligence - all while aligning with the Fusion application security model, lifecycle, and extensibility standards.
Looking Back: OBIA Was Revolutionary - But the World Has Moved On

When it launched, Oracle Business Intelligence Applications (OBIA) was genuinely ahead of its time. Prebuilt subject areas, KPI dashboards, and ETL pipelines for ERP, HCM, SCM, and CRM systems allowed organisations to fast-track enterprise reporting without starting from scratch. OBIA gave business users actionable insights over operational systems, and it helped many enterprises move beyond siloed spreadsheets into a more governed BI model. But OBIA came with constraints that, over time, became significant limitations:
The Modern Alternative: Fusion Data Intelligence

With Fusion Data Intelligence (FDI), Oracle has reimagined what enterprise application analytics should look like in the cloud era.
From OBIA to OAX to FAW to FDI: An Analytics Evolution

FDI didn't appear out of nowhere - it's the result of five years of iterative development across multiple product identities. It began as Oracle Analytics for Applications (OAX), introduced around 2019 as a cloud-based successor to OBIA. OAX was designed to deliver prebuilt analytics for Oracle Fusion Cloud Applications, leveraging Oracle Autonomous Data Warehouse and Oracle Analytics Cloud. In 2020, OAX was rebranded as Fusion Analytics Warehouse (FAW), marking a shift toward a more unified, extensible platform. FAW introduced modular "pillars" aligned with business domains - ERP, HCM, SCM, and CX - each offering curated data models, semantic layers, and prebuilt KPIs. Over the next few years, Oracle expanded these pillars with hundreds of subject areas and embedded machine learning for predictive insights. In 2024, FAW was renamed Fusion Data Intelligence (FDI). This rebranding emphasised its broader mission: not just warehousing analytics, but enabling intelligent decision-making across the enterprise. FDI retained the core architecture - Autonomous Data Warehouse, Oracle Analytics Cloud, and managed pipelines - but added enhanced extensibility, data sharing capabilities, and a more intuitive console for governance and customisation. In short, where OBIA was revolutionary for the on-prem era, FDI is purpose-built for the cloud-native enterprise. It meets today's expectations for agility, integration, governance, and intelligence - without the baggage of yesterday's architecture.

Looking Ahead
This post was just the beginning. Over the next few instalments, we’ll dive deeper into the nuts and bolts of Fusion Data Intelligence - from how it handles extensibility and embedded insights, to what it means for Fusion customers trying to move beyond dashboards and into decision intelligence. FDI represents more than just a new analytics tool - it’s a shift in how Oracle customers can extract value from their SaaS investments. If you’ve ever found yourself battling data silos, struggling with upgrades, or explaining to stakeholders why reporting still takes days, this series is for you. Stay tuned.
When we think about business data, we usually picture tidy tables and dashboards neatly populated with structured relational data. But in reality, much of an organisation’s most valuable information lives in unstructured formats—scanned invoices, PDFs, handwritten notes, and contracts. This data is often locked away in silos, disconnected from the wider analytical ecosystem.
Oracle Analytics’ AI Document Understanding feature changes that. It enables organisations to automatically extract structured data from documents stored in OCI Object Storage using pretrained AI models—all without needing a data science team. With this capability, you can enrich dashboards with data that would previously be too costly or complex to access. In this post, we’ll walk through:
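• what the AI Document Understanding capability is and where it fits;
• the IAM policies Oracle Analytics needs to reach OCI Object Storage and the AI service;
• creating the OCI connection and registering a pre-trained model;
• building a data flow to process your documents; and
• tips, limits, and gotchas to plan for.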
What Is Oracle Analytics AI Document Understanding?
At its core, the AI Document Understanding capability in Oracle Analytics leverages AI models (deployed within Oracle Cloud Infrastructure) to parse and extract fields of interest from documents stored in OCI Object Storage. This is particularly powerful for automating workflows that currently depend on manual data entry or semi-structured file formats. It supports a range of document types and layouts, including:
IAM Policies
To enable Oracle Analytics to securely access documents stored in OCI Object Storage and to invoke AI services like Document Understanding, specific IAM policies must be in place. Without these policies, your OAC instance won’t have the necessary permissions to read documents or trigger AI model processing. In this section, we’ll walk through the exact tenancy- and compartment-level policies required, ensuring your setup is both functional and secure. You can find more information here.
The following IAM policies grant Oracle Analytics the necessary permissions to read from your Object Storage bucket and to invoke the AI Document Understanding service.
Compartment level IAM Policy
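As a minimal sketch - assuming the Oracle Analytics connection uses the credentials of an OCI user in a group called oac-ai-users, and that the bucket lives in a compartment called analytics-docs (both names are placeholders) - the compartment-level statements would look something like the following. Check the Oracle documentation linked above for the exact statements your setup requires.

```
Allow group oac-ai-users to read buckets in compartment analytics-docs
Allow group oac-ai-users to read objects in compartment analytics-docs
```

If results need to be written back to the bucket, the second statement may need to be broadened from read to manage objects.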
Notes
The next policy needs to be defined at the root compartment level.
Root level IAM Policy
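Again as an illustrative sketch using the same placeholder group name, the root (tenancy-level) statement grants access to the Document Understanding service itself; confirm the exact resource family against the Oracle documentation referenced earlier.

```
Allow group oac-ai-users to use ai-service-document-family in tenancy
```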
These policies are necessary to enable Oracle Analytics to access the OCI AI Document Understanding model. Without these policies correctly set up, you will encounter errors when you attempt to run your data flow in Oracle Analytics.
With the IAM policies configured, you can now proceed with setting up the connection and registering the model within Oracle Analytics.
You do this by creating an Oracle Analytics connection to your Oracle Cloud Infrastructure tenancy, which gives you access to your OCI Object Storage bucket.
Register a pre-trained Document Key Value Extraction model with your Oracle Analytics instance, ensuring that the bucket created previously is selected.
This completes all prerequisites and the next step is to run the newly registered pre-trained model in Oracle Analytics by creating a data flow.
The next step is to create a "dataset" that is used as the input to the data flow. This dataset is a CSV file containing the OCI Object Storage URL(s) for the documents you have uploaded. The CSV can either contain one row per document, each with that document's URL, or a single row with the URL of the bucket itself, in which case every document within the bucket will be processed. Personally, the second option is a no-brainer for me. As mentioned earlier in this article, you derive the bucket URL by logging on to the OCI console's bucket details page and copying the URL from your browser. A sample of both options is shown below: the first lists out individual documents with their corresponding URLs, while the second contains a single row instructing the data flow to process every document in the specified bucket.
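To make the two options concrete, here is a sketch of what each CSV might contain. The column names and placeholder URLs are illustrative only - use whatever column names suit you and the actual URLs copied from the OCI console as described above.

Option 1 - one row per document:

```
File Name,File Location
invoice-0001.pdf,<URL for invoice-0001.pdf copied from the OCI console>
invoice-0002.pdf,<URL for invoice-0002.pdf copied from the OCI console>
```

Option 2 - a single row for the whole bucket:

```
File Location
<URL of the bucket details page copied from the OCI console>
```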
Follow the instructions here to create your data flow.
Using the Apply AI Model step, you make a call to the registered pre-trained AI Document Understanding model. You then add a Save Data step in which you specify the output dataset. In my example below, I also have a few Transform Column steps that apply some transformations to selected columns.
Once the data flow has been saved, it can be run to generate the output dataset. You can see a sample data visualisation workbook below based on the output dataset with some insights of the information derived from the invoices.
Tips and tricks for working with unstructured data in Oracle Analytics
Working with unstructured documents - especially at scale - introduces its own set of quirks. Here are some practical insights to help you get the most out of the AI Document Understanding feature in Oracle Analytics:

Use Document Batching Strategically

Oracle Analytics currently imposes a 10,000-row processing limit per run. If you're working with high volumes:
Reuse and Schedule Data Flows

Once you've built a data flow that works, save it and schedule it to run regularly:
Start Small, Then Scale

Try a proof-of-concept with 10–20 documents first:
Gotchas, Limits and Tips
1. Bucket URL Must Be Copied from Browser
The most confusing part of this setup is finding the correct OCI Object Storage bucket URL. It's not visible anywhere in the console UI - you must copy it from the bucket's detail page URL in your browser.

2. 10,000 Document Row Limit
There's a hard limit of 10,000 document rows per data flow run. If your use case involves large volumes of documents, you'll need to split your data or automate batch runs accordingly. Note that this limit is lower when a custom model is used: in that scenario the limit is 2,000 documents.

3. Document Layouts Matter
The AI model is pre-trained for certain layouts (e.g. invoices, forms). Custom layouts may yield mixed results, and you may need to experiment with field mappings to improve outcomes.

4. Use Tags for Traceability
Tag your buckets and policies in OCI with labels like oac-ai-docs so they're easier to audit and maintain.

Conclusion

Oracle Analytics' AI Document Understanding feature bridges a crucial gap between unstructured documents and visual analytics. With a few setup steps - bucket creation, IAM policy configuration, model registration, and a simple data flow - you can surface hidden insights from documents that would otherwise sit untouched. It's a powerful tool, but one with nuances - such as the hidden bucket URL and the processing limits - that are worth planning for. Still, for anyone looking to extend their analytics to the edges of their data estate, this capability opens the door. Oracle Analytics now makes it possible to integrate scanned documents, invoices, and other unstructured data sources directly into your dashboards - unlocking insights that were previously out of reach.

Optimising Performance in Oracle Analytics Cloud: A Deep Dive into Extracted Data Access Mode
The May 2025 update to Oracle Analytics Cloud (OAC) introduces a significant new feature designed to boost performance and reduce dependency on source systems: the Extracted data access mode. This new capability is especially valuable for enterprise users seeking to optimise dashboard responsiveness, reduce backend load, and deliver consistent performance across a variety of usage scenarios. In this expanded post, we’ll delve into what Extracted mode brings to the table, compare it with the existing Live and Cached modes, and offer guidance on how to get the most value from it.
Understanding Data Access Modes in Oracle Analytics Cloud
To fully appreciate the advantages of the new Extracted mode, it helps to revisit the existing data access modes in Oracle Analytics Cloud - namely Live and Cached. Each mode supports different use cases, with varying implications for data freshness, system performance, and architectural complexity.

Live Mode

In Live mode, Oracle Analytics executes every query directly against the source system in real time. Whether a user is exploring a dashboard, applying filters, or drilling into data, each action sends a query to the backend database. Advantages:
Cached Mode

Cached mode creates a temporary local copy of query results within OAC's cache layer. This cache is generated on the fly when users first load a dashboard or perform a query and is reused in subsequent interactions where applicable. Advantages:
Introducing: Extracted Mode (New in May 2025)
The newly introduced Extracted mode provides a more structured and predictable alternative. It allows dataset creators to perform a full extract of data from a source system and store that extract directly within Oracle Analytics. Unlike Cached mode, this data snapshot is proactively managed and completely reusable. Key Benefits of Extracted Mode:
Comparison Table: Live vs Cached vs Extracted Mode
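Summarising the descriptions above, the three modes compare roughly as follows:

| Aspect | Live | Cached | Extracted |
| --- | --- | --- | --- |
| Where queries run | Directly against the source system, in real time | Against a temporary cache built when a dashboard or query is first run | Against a full data extract stored within Oracle Analytics |
| Data freshness | Always current | As fresh as the last cache population | As fresh as the last extract refresh |
| Load on the source system | Every user interaction generates a query | Reduced once the cache is populated | Only when the extract is taken or refreshed |
| Predictability | Dependent on source system performance | Variable, since the cache is temporary and built on the fly | Consistent, high-speed access |
| Management of the copy | No copy to manage | Generated automatically, not proactively managed | Snapshot proactively managed and reusable |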
Cached vs Extracted Mode (Quick Reference):
Considerations:
Creating and Managing Extracted Datasets in OAC
Working with Extracted mode is a straightforward process within Oracle Analytics Cloud’s interface. Here’s a step-by-step guide:
Additional Tips:
Where Extracted Mode Shines: Key Use Cases

The benefits of Extracted mode become most apparent in high-demand or constrained environments. Here are several real-world examples where this mode adds tangible value:
Best Practices for Extracted Mode

To ensure you get the best results from Extracted mode, consider these best practices:
Final Thoughts
The introduction of Extracted mode in Oracle Analytics Cloud marks a significant step forward in how practitioners can balance data freshness, performance, and scalability. By providing a fully materialised, high-speed dataset layer within OAC, this new mode empowers teams to deliver faster, more consistent user experiences without overloading backend systems. It's not a silver bullet - and it won't replace Live mode where real-time data is needed - but for many scenarios, particularly those requiring speed and stability, Extracted mode is a smart and strategic choice. With Oracle continuing to invest in features that improve accessibility, manageability, and user experience, this latest enhancement underlines the platform's commitment to evolving enterprise analytics.

Optimising Data Strategy for AI and Analytics in Oracle ADW: Reducing Storage Costs with Silk Echo

The Growing Challenge of Data Duplication in AI and Analytics

As enterprises increasingly adopt AI-driven analytics, the demand for efficient data access continues to rise. Oracle Autonomous Data Warehouse (ADW) is a powerful platform for analytical workloads, but AI-enhanced processes - such as Agentic AI, Retrieval-Augmented Generation (RAG), and predictive modelling - place new strains on data management strategies. A key issue in supporting these AI workloads is the need for multiple data copies, which drives up storage costs and operational complexity. Traditional approaches to data replication no longer align with the scale and agility required for modern AI applications, forcing organisations to rethink how they manage, store, and access critical business data. This blog builds upon my previous post on AI Agents in the Oracle Analytics Ecosystem, further exploring how AI-driven workloads impact traditional data strategies and how organisations can modernise their approach.

Why AI Workloads Demand More Data

AI models, particularly those leveraging RAG, generative AI, and deep learning, require constant access to vast amounts of data. In Oracle ADW environments, these workloads often involve:
IDC has extensively documented the exponential growth of data and AI investments. Recent industry reports indicate that data storage requirements for AI workloads are expanding at an unprecedented rate. IDC’s broader research reveals several critical insights about AI’s accelerating impact on data ecosystems:
The data explosion is being fuelled by AI use cases like augmented customer service (+30% CAGR), fraud detection systems (+35.8% CAGR), and IoT analytics [1][8]. IDC emphasises that 90% of new enterprise apps will embed AI by 2026, ensuring continued exponential data growth at the intersection of AI adoption and digital transformation [9][12]. AI data volumes are projected to increase significantly, posing challenges for enterprises striving to maintain scalable and cost-efficient storage solutions. Without proactive measures, organisations risk soaring expenses and performance limitations that could stifle innovation.

Sources

[1] Spending on AI Solutions Will Double in the US by 2025, Says IDC - https://www.bigdatawire.com/this-just-in/spending-on-ai-solutions-will-double-in-the-us-by-2025-says-idc/
[2] IDC: Expect 175 zettabytes of data worldwide by 2025 - Network World - https://www.networkworld.com/article/966746/idc-expect-175-zettabytes-of-data-worldwide-by-2025.html
[3] IDC Unveils 2025 FutureScapes: Worldwide IT Industry Predictions - https://www.idc.com/getdoc.jsp?containerId=prUS52691924
[4] IDC Predicts Gen AI-Powered Skills Development Will Drive $1 Trillion in Productivity Gains by 2026 - https://www.idc.com/getdoc.jsp?containerId=prMETA51503023
[5] AI consumption to drive enterprise cloud spending spree - CIO Dive - https://www.ciodive.com/news/cloud-spend-doubles-generative-ai-platform-services/722830/
[6] Data Age 2025 - Seagate Technology - https://www.seagate.com/files/www-content/our-story/trends/files/Seagate-WP-DataAge2025-March-2017.pdf
[7] IDC Predicts Gen AI-Powered Skills Development Will Drive $1 Trillion in Productivity Gains by 2026 - https://www.channel-impact.com/idc-predicts-genai-powered-skills-development-will-drive-1-trillion-in-productivity-gains-by-2026/
[8] Worldwide Spending on Artificial Intelligence Forecast to Reach $632 Billion in 2028, According to a New IDC Spending Guide - https://www.idc.com/getdoc.jsp?containerId=prUS52530724
[9] Time to Make the AI Pivot: Experimenting Forever Isn't an Option - https://blogs.idc.com/2024/08/23/time-to-make-the-ai-pivot-experimenting-forever-isnt-an-option/
[10] How real-world businesses are transforming with AI - with 50 new stories - https://blogs.microsoft.com/blog/2025/02/05/https-blogs-microsoft-com-blog-2024-11-12-how-real-world-businesses-are-transforming-with-ai/
[11] Data growth worldwide 2010-2028 - Statista - https://www.statista.com/statistics/871513/worldwide-data-created/
[12] IDC and IBM lists best practices for scaling AI as investments set to double - https://www.ibm.com/blog/idc-and-ibm-list-best-practices-for-scaling-ai-as-investments-set-to-double/
[13] Nearly All Big Data Ignored, IDC Says - InformationWeek - https://www.informationweek.com/machine-learning-ai/nearly-all-big-data-ignored-idc-says

The Traditional Approach: Cloning Production Data

Historically, organisations have relied on full database cloning to create isolated environments for AI training, model validation, and analytics. While this approach ensures data consistency, it comes with significant drawbacks:
Cost Implications of Traditional Data Cloning To put this into perspective, consider a mid-sized enterprise running an Oracle Autonomous Data Warehouse (ADW) instance with 50TB of data. If multiple teams require their own clones for model training and testing, the storage footprint could easily reach 250TB or more. With cloud storage costs averaging £0.02 per GB per month, this could result in annual expenses exceeding £60,000—just for storage alone. Factor in compute, additional database costs and administrative overhead, and the financial impact becomes even more pronounced. The challenge becomes particularly acute when considering the unique characteristics of AI workloads. Traditional RDBMS architectures were designed for transactional processing and structured analytical queries, but AI workflows introduce several distinct pressures: Data Transformation Requirements: Machine learning models often require multiple transformations of the same dataset for feature engineering, resulting in numerous intermediate tables and views. These transformations must be stored and versioned, further multiplying storage requirements. Concurrent Access Patterns: AI training workflows typically involve intensive parallel read operations across large datasets, which can overwhelm traditional buffer pools and I/O subsystems designed for mixed read/write workloads. This often leads to performance degradation for other database users. Version Control and Reproducibility: ML teams need to maintain multiple versions of datasets for experiment tracking and model reproducibility. Traditional RDBMS systems lack native support for dataset versioning, forcing teams to create full copies or implement complex versioning schemes at the application level. Query Complexity: AI feature engineering often involves complex transformations that push the boundaries of SQL optimisation. Operations like window functions, recursive CTEs, and large-scale joins can strain query optimisers designed for traditional business intelligence workloads. Resource Isolation: When multiple data science teams share the same RDBMS instance, their resource-intensive operations can interfere with each other and with production workloads. Traditional resource governors and workload management tools may not effectively handle the bursty nature of AI workloads. Additionally, the need for data freshness adds another layer of complexity. Teams often require recent production data for model training, leading to regular refresh cycles of these large datasets. This creates significant network traffic and puts additional strain on production systems during clone or backup operations. To address these challenges, organisations are increasingly exploring alternatives such as:
The financial implications extend beyond direct storage costs. Organisations must consider:
As AI workloads continue to grow, organisations need to carefully evaluate their data architecture strategy to ensure it can scale sustainably whilst maintaining performance and cost efficiency. To overcome these challenges, organisations need a solution that optimises storage usage while maintaining seamless access to real-time data. Silk Echo is a powerful tool for optimising database replication in cloud environments. It offers a range of features that improve performance, simplify management, and enhance the resiliency of data infrastructure. Silk Echo enables virtualised, lightweight data replication: instead of creating full physical copies of datasets, it provides near-instantaneous, space-efficient snapshots that eliminate unnecessary duplication.

Introducing Silk Echo: A Smarter Approach to AI Data Management

Silk Echo addresses the challenge of data duplication by providing a high-performance virtualised storage layer. Instead of physically copying data into multiple environments, Silk Echo allows AI workloads, data warehouses, and vector databases to operate on a single logical copy. This reduces unnecessary duplication while maintaining high-speed access to data.

How Silk Echo Works

Virtualised Data Access – Silk Echo enables AI workloads to access data stored in Oracle ADW and other environments without requiring full duplication.
High-Performance Caching – Frequently accessed AI data is cached efficiently to provide rapid query performance.
Seamless Integration – Silk Echo integrates with Oracle ADW, vector databases, and AI model pipelines, reducing the need for repeated ETL processes.
Cost Optimisation – By eliminating redundant data copies, organisations can significantly cut down on storage costs while maintaining AI performance.

Silk Echo represents a shift in how enterprises approach AI and data management, ensuring that AI workloads remain cost-efficient, scalable, and manageable within Oracle ADW environments. The next step is to explore how Silk Echo integrates with specific Oracle AI use cases.

Key Benefits of Silk Echo for Oracle ADW and AI Workloads

Products like Silk's Echo offering provide a number of benefits to the RDBMS architecture that enable efficient, cost-effective support of modern AI workloads. Some of these benefits are:
Future-Proofing Oracle ADW and Oracle Analytics for AI Workloads
The rapid evolution of AI and analytics demands that organisations build future-proof architectures that can scale with new workloads. Silk Echo plays a crucial role in this by:
As AI adoption grows, businesses must rethink their data strategies to balance performance, cost, and scalability. By leveraging Silk Echo in Oracle ADW environments, organisations can:
Are You Ready to Optimise Your AI-Driven Analytics in Oracle ADW?

By adopting next-generation storage solutions like Silk Echo, organisations can unlock the full potential of AI while keeping costs under control. Investing in efficient data management strategies today will ensure businesses remain competitive in the AI-driven future.

The rise of Agentic AI is transforming the analytics landscape, but it comes with an often-overlooked challenge: database strain. Traditionally, operational databases are ringfenced to prevent unstructured, inefficient queries from affecting critical business functions. However, in a world where AI agents dynamically generate and execute SQL queries to retrieve real-time data, production databases are facing unprecedented pressure. Additionally, Retrieval-Augmented Generation (RAG), a rapidly emerging AI technique that enhances responses with real-time data, is further intensifying this issue by demanding continuous access to up-to-date information. RAG works by supplementing AI-generated responses with live or external knowledge sources, requiring frequent, real-time queries to ensure accuracy. This puts even more strain on traditional database infrastructures. In a previous blog post, I looked at how Agentic AI will improve the experience for users of the Oracle Analytics ecosystem. This blog explores the risks of this architectural shift, in which AI agents are at odds with the traditional RDBMS architecture, why traditional solutions such as database cloning fall short, and how modern data architectures like data lakehouses and innovative storage solutions can help mitigate these challenges. Additionally, we examine the implications for the Oracle Analytics Platform, where these changes could impact both data accessibility and performance.

The Problem: AI Agents, RAG & Uncontrolled Query Load

A well-managed production database is typically shielded from unpredictable query loads. Database administrators ensure that only structured, optimised workloads access production systems to avoid performance degradation. But with Agentic AI and RAG, that fundamental principle is breaking down. Instead of a few human analysts running queries, organisations may now have dozens or even hundreds of AI agents autonomously executing SQL queries in real time. These queries are often:
This creates significant challenges for traditional RDBMS architectures, which were not designed to handle the scale and unpredictability of AI-driven workloads. With Retrieval-Augmented Generation (RAG) in particular, AI models require frequent access to real-time data to enhance their outputs, placing additional stress on transactional databases. Since these databases were optimised for structured queries and controlled access, the introduction of AI-driven workloads risks causing slowdowns, performance degradation, and even system failures. For users of Oracle Analytics, this shift presents serious performance implications. If production databases are overwhelmed by AI-driven queries, query response times increase, dashboards lag, and real-time insights become unreliable. Additionally, Oracle Analytics' AI Assistant, Contextual Insights, and Auto Insights features, which rely on efficient access to data sources, could suffer from delays or inaccuracies due to excessive load on transactional systems. To mitigate this, organisations must rethink their database strategies, ensuring that AI workloads are governed, optimised, and properly distributed across more scalable architectures.

The Traditional Approach: Cloning Production Data

One way that organisations have attempted to address this issue is by cloning production databases on a daily or weekly basis to offload AI-driven queries. However, this approach presents several major drawbacks:
For Oracle Analytics users, these challenges could lead to outdated insights, reduced trust in AI-generated recommendations, and a poor user experience due to lagging or inconsistent data. Given these drawbacks, it's clear that cloning is not a viable long-term solution for handling the database demands of Agentic AI.

A Shift in Data Architecture: Data Lakes & Lakehouses

Instead of relying on traditional RDBMS architectures, organisations are increasingly adopting data lakes and lakehouses to support AI-driven analytics. These architectures offer several key advantages:
For users of Oracle Analytics, this shift could mean that existing reports and dashboards need to be refactored to work efficiently with a lakehouse structure, adding additional effort and complexity.

Optimising Performance with Modern Storage Solutions

Beyond adopting new architectural patterns, organisations can leverage modern storage solutions like Silk to mitigate the strain on production databases. Silk provides a virtualised, high-performance data layer that optimises storage performance and scalability without requiring a complete architectural overhaul. By using Silk or similar intelligent storage virtualisation and caching technologies, organisations can:
For organisations using Oracle Analytics, integrating such solutions could help sustain real-time data access while alleviating the performance burden on production databases. However, despite these advantages, storage virtualisation and caching solutions are not a panacea. Organisations must still ensure that their AI workloads are properly governed to prevent excessive resource consumption, and they need to assess whether virtualised storage aligns with their broader data architecture and security policies.
Conclusion: Preparing for the Future of AI-Driven Analytics

Agentic AI and RAG are here to stay, and with them comes a fundamental shift in how data is accessed and managed. However, blindly allowing AI-driven queries to run against production databases is not a sustainable solution. To support the evolving demands of AI, organisations must modernise their data strategies by:
For Oracle Analytics users, this shift will require rethinking how data is stored, accessed, and processed to ensure that the platform continues to deliver timely insights without compromising performance. The key takeaway? Traditional database architectures were not designed for AI-driven workloads. To fully embrace the potential of Agentic AI and RAG, organisations must rethink their data foundations - or risk being left behind. How is your organisation adapting to the challenges of AI-driven analytics? Let’s continue the conversation in the comments! Introduction Structured Query Language (SQL) has been the backbone of analytics for decades, enabling users to extract, transform, and analyse data efficiently. However, with the rise of AI-driven analytics, features like AI Assistant, Auto Insights, and Contextual Insights are allowing users to interact with data without writing SQL. Does this mean SQL is becoming redundant? The answer isn’t a simple yes or no. AI is certainly abstracting SQL from business users, making analytics more accessible, but SQL still plays a critical role behind the scenes. This blog explores how AI is changing SQL’s role, where SQL is still essential, and what the future might hold. How AI is Reducing SQL’s Visibility in Oracle Analytics AI-powered features in Oracle Analytics allow users to explore data without manually writing SQL. Three key capabilities demonstrate this shift: 1. Contextual Insights: Auto-Generated SQL Behind the Scenes • Example: A sales manager sees an unexpected spike in revenue on a dashboard. Instead of running queries, they click on the data point, and Contextual Insights automatically surfaces key drivers and trends. • What happens in the background? Oracle Analytics generates SQL queries to identify correlations, anomalies, and patterns, but the user never sees or writes them. 2. AI Assistant: Querying Data Without SQL • Example: A marketing analyst wants to compare Q1 and Q2 campaign performance. Instead of writing a SQL query, they ask the AI Assistant: “Show me campaign revenue for Q1 vs Q2.” • What happens in the background? The AI Assistant translates the request into SQL, retrieves the data, and presents a visual answer. • Why it matters: Business users get answers instantly, without needing SQL expertise. 3. Auto Insights: Surfacing Trends Without Querying Data • Example: A finance team wants to understand profit fluctuations. Instead of manually querying revenue data over time, they use Auto Insights, which highlights key trends and anomalies. • What happens in the background? Oracle Analytics runs SQL queries to detect significant changes and patterns. These features make SQL less visible but not obsolete. In fact, AI relies on SQL to function effectively, which leads to the question—where is SQL still essential? Why SQL is Still Essential While AI is making SQL more accessible, it hasn’t eliminated the need for SQL expertise. Several areas still require manual intervention: 1. Handling Complex Joins & Business Logic • AI struggles with complex queries that involve multiple joins, subqueries, and conditional logic. • Example: A financial analyst wants to calculate profitability by region, requiring a multi-table join across sales, inventory, and expenses. AI might generate an inefficient or incorrect query. 2. Performance Optimisation • AI-generated SQL isn’t always the most efficient. SQL tuning (e.g., indexing, partitioning) still requires human expertise. 
• Example: AI might generate a query that performs a full table scan instead of leveraging an index, slowing down performance. 3. Explainability & Trust • AI-generated queries can sometimes produce unexpected results, making it difficult for users to validate the logic. • Example: If AI Assistant returns an unusual data trend, an analyst may still need to inspect the underlying SQL to ensure accuracy. SQL remains a crucial tool for data engineers, analysts, and DBAs who need control over data processing, query performance, and governance. However, as AI continues to evolve, could it overcome these challenges? The Role of the Semantic Model in AI-Driven Analytics One of the key features of Oracle Analytics is its semantic model, designed to abstract the complexity of source systems from end users. Instead of writing raw SQL queries against complex database structures, users interact with a logical layer that simplifies relationships, calculations, and security rules. Why the Semantic Model Exists The semantic model serves several purposes, including: • Hierarchies & Drilldowns: Defining business hierarchies (e.g., Year → Quarter → Month) for intuitive analysis. • Logical Joins & Business Logic: Providing a structured way to join tables without requiring users to understand foreign keys or database relationships. • Row-Level Security: Enforcing access control so users only see the data they are authorised to view. This abstraction enables self-service analytics while ensuring data governance, performance, and accuracy. Will AI Make the Semantic Model Redundant? AI-powered analytics features in Oracle Analytics Cloud (OAC)—such as Contextual Insights, AI Assistant, and Auto Insights—are reducing the need for manual query writing. But does this mean the semantic model is no longer needed? Not quite. AI currently relies on the semantic model to: • Ensure accurate and governed data access—AI cannot enforce security rules or business logic without a structured data layer. • Interpret user queries correctly—When an AI Assistant generates SQL, it uses predefined joins and relationships from the semantic model. • Maintain consistency—Without a semantic layer, different AI-generated queries might return inconsistent results due to varying assumptions about data relationships. The Future: AI-Augmented Semantic Models? Rather than replacing the semantic model, AI could enhance it by: • Auto-generating relationships & joins based on data patterns. • Improving performance optimisation, recommending indexing strategies or pre-aggregations. • Enhancing explainability, showing why certain joins or hierarchies were applied. AI and the Semantic Model Will Coexist While AI reduces the need for manual SQL, the semantic model remains essential for structured, governed, and performant analytics. The future is likely AI-assisted semantic models rather than their elimination. The Future of AI in SQL GenerationAI will likely become more sophisticated in handling SQL, but rather than eliminating it, AI will enhance SQL’s role. Here’s what the future might look like: 1. AI-Powered Query Optimisation • AI could not only generate SQL but also analyse and optimise it for better performance. • Example: Future AI Assistants might suggest indexing strategies, rewrite inefficient queries, or recommend materialised views. 2. Better Handling of Complex Joins & Business Logic • AI could integrate knowledge graphs or semantic layers to better understand relationships between tables, improving the accuracy of generated SQL. 3. 
Explainable AI for SQL Generation • AI might offer query rationale explanations, showing users why a specific query was generated and suggesting alternative approaches. 4. AI Agents & Autonomous Databases • AI Agents could work alongside SQL experts, automating routine queries while letting humans handle complex cases. • Oracle's Autonomous Database could play a larger role in self-optimising SQL execution. While AI will continue to reduce the need for manual SQL writing, it is more likely to enhance SQL rather than replace it.

Final Thoughts: Adapting to the AI-Driven Analytics Landscape

AI is shifting SQL from a tool business users interact with directly to something that powers insights in the background. However, this doesn't mean SQL is going away. Instead:
• Business users will rely more on AI-driven insights without needing SQL knowledge.
• Data engineers and analysts will still need SQL expertise to optimise performance, manage governance, and handle complex queries.
• The future is AI-Augmented SQL, not SQL-Free Analytics.

For professionals in the analytics space, this means embracing AI while still sharpening SQL skills. AI will make SQL more powerful, but those who understand both will be best positioned to leverage the full potential of Oracle Analytics.

What Do You Think?

Do you see AI replacing SQL in your analytics work, or do you think SQL will remain a core skill? Let's discuss in the comments!

Next Steps

• If you're interested in seeing these AI-driven analytics features in action, explore Oracle Analytics' AI Assistant, Auto Insights, and Contextual Insights.
• Stay tuned for more insights on AI's role in modern analytics on Elffar Analytics.

As artificial intelligence redefines enterprise technology, this blog explores the cutting-edge potential of AI Agents within Oracle Analytics Cloud - presenting a visionary look at how intelligent, autonomous systems will transform data analytics and decision-making. AI agents have emerged as transformative tools in the realm of analytics and are quickly becoming the "new talk of the town" in the tech world. These advanced systems, driven by large language models (LLMs) and machine learning, are at the forefront of the next wave in generative AI. For instance, OpenAI's ChatGPT now includes plugins and tool integrations that enable it to function as an AI agent capable of completing tasks like booking appointments, analysing datasets, or generating tailored recommendations. Similarly, Google's Gemini incorporates tool usage and contextual learning, making it highly adaptable to user needs. AI agents can autonomously perform tasks, analyse data, and provide actionable insights, representing a shift towards more interactive and intelligent systems. With leading LLMs introducing AI agent capabilities, it's worth exploring how these agents can benefit Oracle Analytics Cloud (OAC) and what use cases they can unlock.

What Are AI Agents?

AI agents are intelligent systems designed to autonomously execute tasks, solve complex problems, and make informed decisions by leveraging their programming and available data. At a high level, they operate by combining three core capabilities:
What sets AI agents apart from traditional systems is their ability to continuously learn from interactions, dynamically adapt to new information, and simulate human-like understanding. In essence, they act as highly capable intermediaries, bridging the gap between user intent and actionable insights. They can:
In analytics, AI agents amplify user productivity by reducing manual intervention and enabling data-driven decision-making at scale. Oracle Digital Assistant: An Example of Agentic AI Oracle Digital Assistant (ODA) is a prime example of how Oracle has already embraced agentic AI. ODA combines conversational AI and task automation to enable businesses to build interactive, intelligent chatbots. While traditional chatbots often follow predefined workflows with limited intelligence, ODA incorporates machine learning and NLP to adapt to user queries dynamically. It can integrate with Oracle applications, providing users with personalised recommendations, automating repetitive tasks, and enhancing the overall user experience. By acting as a virtual assistant that understands context and intent, ODA showcases how Oracle has been leveraging AI agent-like capabilities to enhance enterprise productivity. Benefits of AI Agents in Oracle Analytics Cloud OAC already offers powerful AI and machine learning features, such as natural language querying, auto-insights, and contextual insights. Integrating AI agents with OAC can enhance these capabilities by:
Use Cases for AI Agents in OAC Here are some practical scenarios where AI agents can make a significant impact. These scenarios demonstrate how AI agents can bridge the gap between user intent and actionable insights, leveraging their intelligent processing capabilities to enhance productivity, efficiency, and decision-making:
Existing AI Agent Functionality in OAC OAC already incorporates several AI-driven features that mimic AI agent functionality, including capabilities that proactively assist users, streamline analytics tasks, and enhance decision-making. These features leverage automation, machine learning, and natural language processing to deliver intelligent insights and actionable recommendations, much like a fully realised AI agent would.
Conclusion
The integration of AI agents into Oracle Analytics Cloud represents a transformative leap forward in enterprise analytics. By seamlessly blending autonomous intelligence with data analysis, OAC is poised to redefine how organisations extract, interpret, and act on insights. These AI agents will not merely enhance current workflows - they will fundamentally reimagine the analytics experience, enabling predictive, proactive, and personalised decision-making at an unprecedented scale. As AI continues to evolve, Oracle Analytics Cloud stands at the forefront of a paradigm shift. While competitors like Microsoft Power BI and Tableau are exploring AI-driven features, OAC has the potential to leapfrog traditional approaches by embedding truly intelligent, contextually aware agents that can autonomously navigate complex data landscapes. The future of analytics is not just about reporting - it's about creating intelligent systems that anticipate needs, generate insights, and drive strategic actions with minimal human intervention. The journey of AI agents in Oracle Analytics Cloud is just beginning, promising a new era of data intelligence that transforms how organisations understand and leverage their most critical asset: information.
As organisations strive to make faster, smarter decisions, analytics tools must evolve to offer more than static dashboards and manual data exploration. Enter Contextual Insights, a game-changing feature set to debut in Oracle Analytics Cloud (OAC) as part of the January 2025 update.
Thanks to the Oracle Analytics Product Management team, I was given early access to this feature and have had the opportunity to explore it hands-on. In this blog, I'll share my insights, experience, and feedback to help you understand the transformative potential of Contextual Insights.

What Are Contextual Insights?

Contextual Insights are dynamically generated insights that appear based on the context of the data being analysed. Powered by Oracle's advanced AI algorithms, these insights surface anomalies, trends, and patterns without requiring users to manually search for them. For instance, imagine you're analysing sales data for a retail chain. Contextual Insights could highlight a sudden drop in sales for a specific region or a spike in returns for a particular product category - all without you needing to ask.

Key Features of Contextual Insights

1. AI-Powered Recommendations: Contextual Insights leverage AI to suggest deeper analysis opportunities, such as exploring correlations between variables or detecting unusual behaviour in your data.
2. Dynamic Visualisations: The insights are not just textual; they include visual representations such as trend lines, scatter plots, or bar charts, making the findings easier to understand at a glance.
3. User-Centric Design: These insights are tailored to the user's role and the context of their query, ensuring that the most relevant information is surfaced.
4. Seamless Integration: Contextual Insights work seamlessly with other OAC features like Auto Insights, Narratives, and the Natural Language Query (NLQ) interface.
How Contextual Insights Fit Into OAC
Oracle has designed Contextual Insights to seamlessly integrate into the Oracle Analytics Cloud (OAC) experience, enhancing usability while preserving the intuitive workflows users rely on. This integration ensures that insights appear naturally during the analytical process, enriching user interactions without introducing complexity. Key Examples of Integration: • Enhanced Visualisation Analysis: When users view a visualisation, Contextual Insights proactively surfaces anomalies, outliers, or unexpected trends specific to the data currently in focus. For instance, a sales trend chart might highlight a sudden dip in revenue for a specific region or product line, prompting immediate investigation. • Ad-Hoc Analysis Empowerment: In exploratory scenarios, where users are conducting ad-hoc analysis, Contextual Insights helps uncover patterns or correlations that might not have been apparent. For example, when analysing marketing campaign data, it might reveal that a spike in customer engagement correlates with specific demographic segments or external factors. A Unified Workflow for Enhanced Decision-Making: This tight integration allows Contextual Insights to complement the natural flow of analysis, ensuring that users discover actionable insights without needing to disrupt their existing processes or switch contexts. By blending seamlessly with tools like dashboards, visualisations, and exploration interfaces, Contextual Insights empowers users to focus on making data-driven decisions rather than spending time searching for information. Bridging Technical Gaps: For non-technical users, this integration is particularly valuable, as it eliminates the need for advanced data expertise to identify meaningful insights. Meanwhile, for advanced analysts, it acts as a catalyst, streamlining deeper exploration and enabling quicker hypothesis testing. By embedding Contextual Insights deeply within OAC, Oracle delivers an analytics experience that is not only smarter but also more intuitive and inclusive, ensuring users at all levels can uncover hidden opportunities with ease.
Why Contextual Insights Matter
In a world where data-driven decisions are becoming the norm, the ability to uncover relevant, actionable insights at the right moment can be the difference between staying ahead of the curve and falling behind.

1. Democratisation of Analytics: Contextual Insights empower users who may not have technical expertise in data analytics, widening the reach of OAC across organisations.
2. Enhanced Productivity: By surfacing insights automatically, analysts save time previously spent on manual data exploration.
3. Proactive Decision-Making: Contextual Insights shift the focus from reactive reporting to proactive planning by identifying trends and anomalies in real time.

How Contextual Insights Complements Auto Insights

While Auto Insights in Oracle Analytics Cloud focuses on providing users with automatically generated, high-level summaries and narratives about their data,
Contextual Insights takes this a step further by tailoring those insights to the user's specific context. Auto Insights excels at offering a broad overview, such as key performance indicators or summarised trends, whereas Contextual Insights dynamically surface patterns, anomalies, and trends based on the user's immediate data interactions. Together, these features create a seamless experience where users can move from a high-level understanding to in-depth exploration, uncovering actionable insights with minimal effort. This synergy ensures that users at all skill levels can maximise the value of their data, moving from descriptive to diagnostic analytics effortlessly.
Real-World Use Cases
Contextual Insights unlock powerful opportunities across a variety of business sectors by surfacing patterns, trends, and anomalies that have the potential to drive more informed and proactive decision-making.
How to Get Started

Once the January 2025 update is live, enabling and using Contextual Insights will be straightforward.
1. It is configured at a visualisation level.
2. Ensure that Contextual Insights is enabled for your visualisations.
3. Select the data item to analyse and choose the "Explain Selected" option from the context menu.
For more detailed steps, refer to Oracle's YouTube videos explaining the feature and detailing the configuration steps to set up Contextual Insights.
Final Thoughts
Contextual Insights represent a major leap forward in empowering organisations to make faster, smarter, and more informed decisions. By integrating advanced AI-driven capabilities directly into Oracle Analytics Cloud, this feature enables users of all skill levels to uncover hidden opportunities and respond proactively to emerging trends. As analytics tools evolve, features like Contextual Insights showcase Oracle's commitment to democratising analytics and fostering innovation. Whether you're a data novice or a seasoned data scientist, Contextual Insights can transform how you explore and act on your data. Embrace this feature to unlock the full potential of your analytics workflows and drive meaningful outcomes in your organisation. If you're ready to explore its possibilities or need support with OAC, reach out or leave a comment below!

In today's data-driven world, the ability to transform unstructured data into actionable insights is critical for organisations. With the November 2024 release of Oracle Analytics Cloud (OAC), the integration with OCI Document Understanding - which brings cutting-edge capabilities to businesses looking to unlock the value hidden in their documents - has been extended to allow users to register custom models. You can find out more information on how to create a custom model here. In this blog, we will be looking at document understanding and how it fits into analytics. Here's how this feature empowers analytics workflows.

From Unstructured to Actionable: The Role of Text Extraction

Many essential business processes rely on unstructured documents such as contracts, invoices, shipping manifests, and feedback forms. These documents often contain vital data, but their formats - PDFs, scanned images, or handwritten forms - make extracting and analysing this data manually a time-consuming and error-prone process.

Key Benefits of Text Extraction for Analytics

Text extraction is often viewed as a preliminary step rather than an intrinsic part of analytics, but this perspective underestimates its transformative impact on modern data workflows. In today's organisations, vast amounts of critical information remain trapped in unstructured formats - documents, emails, contracts, and scanned images. Without the ability to extract and structure this data, analytics initiatives risk missing out on valuable insights hidden in plain sight. By integrating text extraction directly into analytics workflows, businesses not only bridge the gap between unstructured and structured data but also enhance the scope and accuracy of their insights. While it may seem that text extraction belongs solely to the domain of data preparation, its seamless integration into analytics platforms changes the game. By enabling users to work directly with previously inaccessible information, text extraction ensures that analytics becomes truly comprehensive. This convergence eliminates the need for siloed processes, accelerates decision-making, and empowers users to leverage their data assets fully. As the lines between data preparation and analytics blur, text extraction proves itself not as a separate utility but as an essential enabler of meaningful, end-to-end analytics workflows. Some of the benefits of integrating text extraction with analytics are:

1. Streamlined Data Preparation
Extracted text is ready for analysis without requiring extensive manual intervention.
For example, a retail company can process thousands of supplier invoices, extracting line-item details such as product names, prices, and quantities. This structured data feeds into Oracle Analytics for further preparation and enrichment, such as cleansing inconsistent naming conventions or enriching data with external sources. 2. Improved Decision-Making By leveraging the extracted text, users can create dashboards that provide actionable insights. A logistics company, for example, might track delivery times and costs across suppliers, identifying inefficiencies and opportunities to renegotiate contracts. 3. Cross-Document Analysis OCI Document Understanding enables businesses to analyse trends across a corpus of documents. A financial institution can aggregate key metrics from thousands of contracts, such as interest rates or repayment terms, to assess portfolio risk and optimise lending strategies. 4. Advanced Search and Contextual Insights Once text is extracted, it can be indexed and searched, enabling users to locate specific terms or patterns across document sets. For instance, legal teams can identify clauses that might expose the organisation to risk, while sales teams can quickly review terms in customer contracts to tailor offers. Registering a Pre-trained Document Key Value Extraction Model in Oracle Analytics Cloud Oracle Analytics Cloud provides access to some pre trained OCI document understanding models. This process allows you to leverage the AI capabilities of OCI Document Understanding within OAC to automatically extract key data points from your documents. Here are the detailed steps involved: Access the Model Registration Function: Begin by navigating to the OAC Home Page. In the top right corner, locate the three-dot menu (ellipsis) and select "Register Model/Function." From the options presented, choose "OCI Document Understanding Models" Establish the OCI Connection: Next, you'll need to select your OCI connection. If you haven't already established a connection between OAC and OCI, you'll be prompted to create one. This connection is crucial as it enables OAC to interact with the OCI Document Understanding service. Select the Desired Model Type: Once the OCI connection is established, a "Select a Model" window will appear. Choose "Pretrained Document Key Value Extraction" as the model type. This specific model is designed to identify and extract key data from documents, such as merchant names, addresses, and total prices. Specify the OCI Bucket and Document Type: In the right-side panel of the "Select a Model" window, you'll need to provide two crucial pieces of information:
Registering a Pre-trained Document Key Value Extraction Model in Oracle Analytics Cloud
Oracle Analytics Cloud provides access to several pre-trained OCI Document Understanding models. Registering one allows you to leverage the AI capabilities of OCI Document Understanding within OAC to automatically extract key data points from your documents. Here are the steps involved:

Access the Model Registration Function: Begin by navigating to the OAC Home Page. In the top-right corner, locate the three-dot menu (ellipsis) and select "Register Model/Function". From the options presented, choose "OCI Document Understanding Models".

Establish the OCI Connection: Next, select your OCI connection. If you haven’t already established a connection between OAC and OCI, you’ll be prompted to create one. This connection is crucial, as it enables OAC to interact with the OCI Document Understanding service.

Select the Desired Model Type: Once the OCI connection is established, a "Select a Model" window will appear. Choose "Pretrained Document Key Value Extraction" as the model type. This model is designed to identify and extract key data from documents, such as merchant names, addresses, and total prices.

Specify the OCI Bucket and Document Type: In the right-side panel of the "Select a Model" window, provide two pieces of information: the OCI bucket that holds your documents and the document type.

Provide a Model Name and Register: Finally, give your model a descriptive name for easy identification within OAC, then click "Register" to complete the process. You can view your registered model under the "Models" tab on the Machine Learning page of OAC.

By following these steps, you register a pre-trained document key value extraction model in OAC, setting the stage for streamlined data preparation and enhanced analysis. You can then create data flows within OAC to apply the registered model to your documents, extract the desired key values, and use this structured data to generate valuable insights. If the pre-trained models are not a fit for your specific use case, you can also create your own custom model and register it for use in Oracle Analytics Cloud - this is the new capability added in the November 2024 update.
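For readers who prefer to script the extraction itself rather than go through the OAC data flow route, the same pre-trained key value extraction can, to the best of my knowledge, be invoked directly through the OCI Python SDK. The sketch below is illustrative only: the compartment OCID and file name are placeholders, and the oci.ai_document class and method names should be verified against your SDK version.

```python
# Minimal sketch of calling OCI Document Understanding key-value extraction
# via the OCI Python SDK (oci.ai_document), assuming a configured ~/.oci/config.
# OCID and file name are placeholders; verify class names against your SDK version.
import base64
import oci

config = oci.config.from_file()
client = oci.ai_document.AIServiceDocumentClient(config)

# Documents can be passed inline as base64 (Object Storage sources are also supported).
with open("invoice.pdf", "rb") as f:
    encoded_doc = base64.b64encode(f.read()).decode("utf-8")

details = oci.ai_document.models.AnalyzeDocumentDetails(
    compartment_id="ocid1.compartment.oc1..exampleuniqueid",  # placeholder OCID
    document=oci.ai_document.models.InlineDocumentDetails(data=encoded_doc),
    features=[oci.ai_document.models.DocumentKeyValueExtractionFeature()],
)

response = client.analyze_document(details)

# The result contains the detected pages and key-value fields (for example the
# merchant name and total amount); here we simply print the raw model to inspect it.
print(response.data)
```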
Summary
In conclusion, the integration of text extraction capabilities within analytics workflows represents a pivotal advancement for organisations striving to unlock the full potential of their data. By transforming unstructured content into actionable insights, tools like OCI Document Understanding within Oracle Analytics Cloud bridge the gap between data preparation and analysis, enabling faster, more accurate decision-making. While debates may persist about whether text extraction is a standalone process or part of analytics, its value in delivering comprehensive, data-driven outcomes is undeniable. As businesses continue to navigate an increasingly data-rich landscape, embracing these capabilities will be key to maintaining a competitive edge.

Oracle Analytics has just announced the general availability of the AI Assistant, a new feature that allows users to interact with their data using natural language. This means you can ask questions in plain English and the Assistant will generate visualisations and insights based on your dataset.

Democratisation of Data Analytics: The AI Assistant makes data analysis accessible to a wider range of users, not just those with advanced technical skills or a deep understanding of the data visualisation tool. By allowing users to interact with data using natural language, it eliminates the need for complex SQL queries or programming knowledge, empowering more people within an organisation to explore data and derive valuable insights. This feature is part of Oracle’s broader AI-driven efforts within Oracle Analytics Cloud, which also include integration with machine learning models and advanced natural language generation for smart data narratives.

Below is a summary of the steps to follow to get the AI Assistant up and running:
This Oracle Analytics blog provides detailed information on the simple steps needed to set this up. It requires some configuration to enable the feature at a dataset level. To improve the user experience, it is worth setting up synonyms for the dataset attributes: these let you provide alternative names for each attribute, which improves the results generated by the AI Assistant (a small illustrative sketch of this idea appears below). Remember to choose the option to index the dataset for the Assistant; there is also an option to index the dataset for both the Assistant and the Homepage Ask feature if required.

Upon initial review, the Oracle Analytics AI Assistant shows significant potential for enhancing productivity and easing the workflow, particularly for those less familiar with advanced analytics tools. While the setup, particularly dataset configuration and indexing, may require some upfront effort, once configured the tool truly shines. The integration of natural language querying makes it much easier to interact with the data, reducing the need to master the complexities of Oracle Analytics Cloud (OAC). The feature’s ability to generate instant visual insights from natural language inputs is particularly valuable: once your dataset is indexed and ready, you can quickly derive meaningful insights, spot trends, or explore relationships in the data without having to dive into manual visualisation processes. Auto Insights can also uncover patterns you may not have initially considered, adding further value by highlighting hidden connections.

Subject Areas
This version of the AI Assistant does not currently support subject areas. However, this capability is part of the planned future enhancements and is expected to be integrated into the Assistant in due course. Once implemented, it will allow users to engage with the Assistant more effectively, offering deeper insights and tailored assistance based on specific subject areas.

The Oracle Analytics AI Assistant is a noteworthy addition to the Oracle Analytics platform. It offers a more intuitive approach to data analysis by allowing users to interact with data using natural language. While this feature can simplify certain tasks and provide valuable insights, it may not replace the need for human expertise in all cases. As with any AI-powered tool, it is crucial to understand its limitations and use it in conjunction with traditional data analysis methods.
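To make the synonym idea mentioned above a little more tangible, here is a purely illustrative Python sketch - it is not an Oracle Analytics Cloud API - showing how alternative attribute names might be mapped back to canonical dataset columns so that a natural-language question resolves consistently.

```python
# Purely illustrative: mapping user-friendly synonyms to dataset attributes.
# This is not an Oracle Analytics Cloud API, just a sketch of the concept.
SYNONYMS = {
    "revenue": "SALES_AMOUNT",
    "turnover": "SALES_AMOUNT",
    "headcount": "EMPLOYEE_COUNT",
    "staff numbers": "EMPLOYEE_COUNT",
}

def resolve_attribute(term: str) -> str:
    """Return the canonical dataset column for a word used in a question."""
    return SYNONYMS.get(term.lower().strip(), term)

print(resolve_attribute("Turnover"))   # -> SALES_AMOUNT
print(resolve_attribute("headcount"))  # -> EMPLOYEE_COUNT
```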
Update - 19th September 2024
After some conversations with the Product Management team, I have gathered more information about the AI Assistant, covering the rollout process and some details about security and data privacy, which you can find below.

Oracle’s objective is to make the AI Assistant available at no additional cost across all OCPU shapes. However, its release will be managed through a phased, controlled rollout spanning several Oracle Analytics Cloud updates. This approach ensures optimal performance and stability while addressing key concerns around privacy and security.

The Large Language Model (LLM) powering the Assistant is highly resource-intensive and Oracle-managed. Due to its proprietary nature and status as protected intellectual property, specific details regarding the LLM are limited. However, each Oracle Analytics Cloud instance is provisioned with its own private, Oracle-managed LLM. To further safeguard data privacy, all interactions between the LLM and each individual Oracle Analytics Cloud instance remain entirely within the customer’s OCI environment. At no point do these interactions leave the customer’s infrastructure, ensuring complete data security and confidentiality.
Author
A bit about me: I am an Oracle ACE Pro, Oracle Cloud Infrastructure 2023 Enterprise Analytics Professional, Oracle Cloud Fusion Analytics Warehouse 2023 Certified Implementation Professional, Oracle Cloud Platform Enterprise Analytics 2022 Certified Professional, Oracle Cloud Platform Enterprise Analytics 2019 Certified Associate, and a certified OBIEE 11g Implementation Specialist.