Smarter Screens, Smarter Solutions: Oracle’s UX Revolution Using AI with VBCS

Introduction

In Oracle Cloud, tools such as VBCS and OIC make it easy to build modern applications and connect them to other systems. With VBCS, OIC, and the database (DB), we can create custom apps that meet user needs and fulfill business requirements.

With OCI’s AI services, we can go a step further and build apps that reduce user effort and add intelligence to the experience, helping users make smarter decisions.

OCI provides different AI services, such as Vision, Language, and Document Understanding. Using these services, we can:

  • Read text from images and extract useful data (such as invoices or receipts).
  • Understand the emotion behind user feedback.
  • Analyze images and identify objects.

We can create smart applications by using VBCS for the user interface and OIC/DB for business logic. In this blog, we will walk through how we can combine VBCS, OIC, and OCI’s AI services. We will look at how these tools work together. First, we’ll cover the simple architecture. Then, we’ll share practical use cases for this setup. Finally, we’ll outline the key steps to design and implement them.

Note: The focus of this blog is to give ideas about the processes and what’s possible rather than deep technical details or code.

Oracle Tools / Oracle Technology Stack

We are going to use the following Oracle Cloud offerings.

VBCS:

We will use VBCS to develop the user interface. We can build clean, responsive UIs with VBCS, along with JavaScript when needed. The VBCS application will be the frontend that users see.

OIC:

OIC will handle all the business logic. Data transformation, API calls, and connections to third-party systems will be handled in OIC, which acts as a bridge between the VBCS application and OCI AI services.

OCI AI Services:

Oracle offers ready-to-use AI services such as:

  • Vision – for analyzing the images
  • Language – for extracting meaning or tone
  • Document Understanding – for reading and analyzing documents

You do not need to train models or handle complex data science; these services are prebuilt and easy to call from OIC using their REST APIs.
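
For illustration only, here is a minimal Python sketch of what such a call can look like when made directly against the Document Understanding REST endpoint. The region, compartment OCID, and file name are placeholders, and the endpoint path and payload shape should be verified against the current API reference; in the actual solution, OIC’s REST adapter with OCI Signature authentication makes this call for us.

```python
import base64
import requests
import oci  # OCI Python SDK, used here only for request signing

# Credentials from the standard ~/.oci/config file (placeholder profile).
config = oci.config.from_file()
signer = oci.signer.Signer(
    tenancy=config["tenancy"],
    user=config["user"],
    fingerprint=config["fingerprint"],
    private_key_file_location=config["key_file"],
)

# Illustrative endpoint for the synchronous "analyzeDocument" action;
# replace the region and confirm the path against the API documentation.
endpoint = (
    "https://document.aiservice.us-ashburn-1.oci.oraclecloud.com"
    "/20221109/actions/analyzeDocument"
)

with open("invoice.pdf", "rb") as f:
    doc_b64 = base64.b64encode(f.read()).decode()

payload = {
    "compartmentId": "ocid1.compartment.oc1..example",  # placeholder OCID
    "features": [{"featureType": "KEY_VALUE_EXTRACTION"}],
    "document": {"source": "INLINE", "data": doc_b64},
}

response = requests.post(endpoint, json=payload, auth=signer)
response.raise_for_status()
print(response.json())  # extracted fields with confidence scores
```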

Use Case – AP Invoice Capture using Document Understanding

In most cases, teams receive invoices as PDFs or scanned images.

After receiving the document, the user must manually enter all the required details, such as supplier name, invoice number, invoice date, and amount. This is a repetitive task that consumes a lot of the user’s time and effort.

With VBCS, OIC, and OCI’s Document Understanding service, we can automate most of this process and make it much faster.

Business Scenario:

The user uploads the invoice image in the VBCS application. The file is then sent to Oracle’s AI service, which reads the document and pulls out the key fields. Once OIC returns the response, the values appear on the VBCS page, and the user can verify them before submitting the details to Oracle ERP.

High Level Process Flow:

  1. The user uploads an invoice in the VBCS form
  2. OIC picks up the file and sends it to OCI Document Understanding
  3. The AI extracts the key data and sends the response back to OIC
  4. OIC maps and stores the required data, which is then displayed to the user
  5. The user reviews the details and submits the form
  6. A new invoice record is created in Oracle ERP (a hedged REST sketch of this step appears after this list)
  7. The VBCS UI shows the status (Success/Error) after the submission to Oracle ERP
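
As a rough illustration of step 6, the sketch below posts the user-confirmed values to the Oracle Fusion Payables invoices REST resource. The host, credentials, and attribute names are indicative only and should be checked against the REST API documentation for your instance; in the actual flow, OIC’s ERP Cloud adapter would make this call.

```python
import requests

# Placeholder host and API version; confirm against your pod's REST catalog.
erp_url = (
    "https://your-erp-instance.oraclecloud.com"
    "/fscmRestApi/resources/11.13.18.05/invoices"
)

# Values the user confirmed on the VBCS page (attribute names are indicative).
invoice = {
    "InvoiceNumber": "INV-1001",
    "InvoiceAmount": 1250.00,
    "InvoiceDate": "2024-05-31",
    "InvoiceCurrency": "USD",
    "BusinessUnit": "US1 Business Unit",
    "Supplier": "Acme Supplies",
    "SupplierSite": "ACME-HQ",
}

resp = requests.post(
    erp_url,
    json=invoice,
    auth=("integration.user", "password"),  # basic auth for illustration only
    headers={"Content-Type": "application/json"},
)

if resp.ok:
    print("Invoice created:", resp.json().get("InvoiceId"))
else:
    print("ERP returned an error:", resp.status_code, resp.text)
```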

How to Achieve it:

  • In VBCS, use the file upload component to let users upload the invoice images
  • In OIC, build an integration that takes the file, calls the OCI Document Understanding API, and parses the response
  • Map the extracted fields received in the response to the fields shown on the VBCS page
  • Build an exception-handling mechanism for flows where user review is skipped, covering cases such as the AI being unable to read the file or key fields being missing

Note:

For data accuracy, we need to keep a few things in mind.

While OCI Document Understanding does a good job with common invoice formats, accuracy can vary depending on the quality of the uploaded document and how structured it is.

Below are a few tips to improve the results:

  • Good Document Quality
    • Clear, high-resolution images (300 DPI or more) work best
    • Avoid documents with stamps or skewed layouts
  • Use the right extraction model
    • OCI Document Understanding offers multiple extraction models, such as key-value extraction and table extraction
    • Choose the one that best fits your document type, or test a few to see which works best for your needs
  • Post-processing in OIC
    • You can use data enrichment steps in OIC to clean or reformat the AI results
    • For example, if the AI outputs a date in the wrong format, you can correct it before sending it to ERP.
  • Confidence Scores
    • OCI responses include a confidence score for each extracted field
    • You can set a threshold (e.g., only accept a value if its confidence score is above 90%) and flag the remaining values for manual review, as in the sketch after this list
  • Analyze data and train custom models
    • You can store the failed or low-confidence invoices and analyze them later to retrain custom models or to adjust the integration logic accordingly
    • This way, we can make the process more reliable and scalable over time
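
To make the confidence-score idea concrete, here is a minimal sketch of the kind of post-processing OIC (or a small helper service) could apply. The response structure used here (pages containing documentFields, each with a fieldLabel, fieldValue, and confidence) is a simplified assumption about the key-value extraction output, so adjust the paths to the actual payload you receive.

```python
CONFIDENCE_THRESHOLD = 0.90  # accept values only above 90% confidence

def split_by_confidence(ai_response: dict):
    """Split extracted key-value fields into auto-accepted and review buckets.

    Assumes a simplified response shape (pages -> documentFields, each with a
    fieldLabel carrying name/confidence and a fieldValue carrying text);
    verify against the real analyzeDocument payload in your tenancy.
    """
    accepted, needs_review = {}, {}
    for page in ai_response.get("pages", []):
        for field in page.get("documentFields", []):
            name = field.get("fieldLabel", {}).get("name", "UNKNOWN")
            value = field.get("fieldValue", {}).get("text")
            confidence = field.get("fieldLabel", {}).get("confidence", 0.0)
            if confidence >= CONFIDENCE_THRESHOLD:
                accepted[name] = value
            else:
                needs_review[name] = value
    return accepted, needs_review

# Fields in needs_review would be highlighted on the VBCS page for the user
# to confirm or correct before the invoice is submitted to ERP.
```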

Key Learnings, Challenges and Best Practices

  • Design for latency
    • Allow for short delays from AI calls by using loaders, async triggers, or batch processing (a minimal submit-and-poll sketch follows this list).
  • Validate AI results through human review
    • Always give users the ability to accept, edit, or reject AI-generated output.
  • Develop reusable flows
    • Create generic OIC flows (such as a Vision API handler) that you can reuse across multiple applications.
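
As one way to design for latency, the sketch below shows a simple submit-and-poll pattern against a hypothetical asynchronous OIC endpoint. The URL, field names, and timings are made up for illustration; in a real VBCS application the equivalent logic would sit in an action chain behind a loading indicator.

```python
import time
import requests

# Hypothetical asynchronous OIC integration endpoint (illustrative URL only).
BASE_URL = "https://example-oic-host/ic/api/integration/v1/flows/rest/INVOICE_AI/1.0"

def submit_and_wait(file_bytes: bytes, timeout_seconds: int = 120) -> dict:
    """Submit a document for asynchronous processing and poll for the result."""
    # Hypothetical submission endpoint returning a job identifier.
    job = requests.post(f"{BASE_URL}/jobs", data=file_bytes).json()
    job_id = job["jobId"]

    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        status = requests.get(f"{BASE_URL}/jobs/{job_id}").json()
        if status.get("state") in ("SUCCEEDED", "FAILED"):
            return status
        time.sleep(2)  # the UI keeps showing a loader while we wait
    return {"state": "TIMED_OUT", "jobId": job_id}
```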

Expanding on This Topic

Using OCI AI services with Oracle Cloud Applications may sound like a new complication, but as we have seen, it is very much achievable using the available Oracle Cloud offerings: VBCS, OIC, and OCI AI Services.

There are other interesting use cases as well that can be implemented effectively using VBCS, OIC, and OCI AI Services.

  • Product classification through OCI Vision
    • Leverage OCI Vision’s image recognition to automatically categorize product images, with OIC orchestrating data ingestion and model invocation. VBCS provides an intuitive dashboard for users to review and refine classification results in real time.
  • Feedback analysis using OCI Language
    • Utilize OCI Language to perform sentiment analysis, entity recognition, and topic extraction on customer feedback ingested via OIC. VBCS shows important insights and trends in real time, helping stakeholders fix problems fast and boost customer satisfaction.
  • Smart forms for compliance reporting
    • Build intelligent VBCS forms that leverage OCI AI Services to auto-extract and validate compliance data from uploaded documents, with OIC handling backend workflows. The solution makes regulatory reporting easier by adding real-time validation, error detection, and approval steps right in the app.

By combining the right tools with a thoughtful architecture, such as databases for control and logic, OIC for middleware, and AI for insights, you can deliver scalable solutions.

    Work. Laugh. Repeat. The Orbrick Way

    They say your first job leaves an indelible mark, becoming the foundation for the rest of your professional life, kind of like the first time you try to make instant noodles and accidentally set off the smoke alarm. Whether you realize it or not, it’s the place where your core values begin to take shape, guiding you through future endeavours and relationships. As a 2024 graduate stepping into the corporate world, Orbrick Consulting has become more than just my first workplace – it’s my first crash course in adulting. 

    What is work culture, really? Is it the coffee breaks where we vent about managers and laugh it off? The table tennis matches that only end when someone mutters, “Maybe we should work”? Or the pizza and coke after surviving tough clients? Maybe it’s more than that—it’s about people. It’s about those small efforts that make everyone, from the CEO to the office attendant, want to show up on a Monday morning (yes, I said Monday). It’s about creating an environment where seeing your manager walk over with more tasks feels like a challenge to grow, not a reason to wish you were elsewhere. 

    Here, fostering a healthy work culture isn’t just a goal—it’s a commitment. From prioritizing mental health to creating spaces where everyone feels heard, innovative steps are woven into our daily lives. Mental health, being the cornerstone of well-being, is given utmost priority. We have alternate-day meditation sessions, including manifestation practices (or just a good lunch), to help us centre ourselves. There’s also a unique benefit—a dedicated mental health leave employees can take anytime during the year. At our entrance, a well-being wall with four quadrants invites us to share how we’re feeling each day, creating a space for reflection, and helping the company understand and support us better. Orbrick doesn’t just hear us; it listens, making every day a little brighter, even Mondays. 

    At the company, we take “team bonding” to a whole new level with our weekly series, “Know Your Brickster.” It’s basically show-and-tell for grown-ups, where one lucky Brickster gets to spill the beans about their life while everyone else asks everything from “What’s your hidden talent?” to “Pineapple on pizza: yes or no?” For an hour, it’s not about work, it’s about the person behind the desk. It’s a fun, slightly chaotic way to turn colleagues into friends, break the ice, and remind us that behind every email is a real human.

    At Orbrick Consulting, learning doesn’t stop at your desk—it thrives in our Knowledge Transfer Sessions where the company’s leaders generously share their expertise (and occasionally their life hacks). From “How to Enhance Your Business Knowledge” to “Mastering MS Office Tools” these sessions are packed with insights that bridge the gap between theory and practice. We’ve explored topics like functional and technical roles, project management basics, and even the fine art of change management. It’s like TED Talks, but with fewer buzzwords and more practical takeaways. These sessions not only sharpen our skills but also inspire us to think bigger, work smarter, and inch closer to becoming the best versions of ourselves.

    At Orbrick, physical health and creativity walk hand in hand—literally. Employees who hit 6,000 steps for at least 85% of the quarter are rewarded with prizes and a heartfelt appreciation letter from our leaders. Think of it as getting fit with a side of fame. For the bookworms (or aspiring ones), there’s a new initiative: read 12 books in a year and earn rewards. And then there’s Spark Tank, our quarterly showdown of brilliance. Teams brainstorm three technical and feasible ideas around a tagline, and the winning idea not only gets bragging rights but also a share of the profits when it hits the market. 

    In the end, your first job is never just about the work; it’s about figuring out how to look busy during a surprise meeting. It’s been a rollercoaster of growth, laughter, and plenty of “what just happened?” moments. If this is the foundation for my career, I’d say it’s off to a great start. Here’s to first jobs, lessons that last, and a lifetime of figuring it all out – one coffee at a time.

    Processing Large Data in Oracle ERP System

    Introduction

    Oracle Fusion ERP helps businesses manage their operations and make decisions from the data available in the system. However, problems arise when data volumes increase as organizations grow. Working with large datasets in the system can lead to performance issues when executing large reports, running integrations that rely on report data, or performing data extractions. For technical teams, handling huge volumes of data effectively can be quite challenging.

    Large datasets can cause several performance issues, like:

    1. Report Failures: Scheduled or online reports can often fail or time out because large data volumes exceed system limits.
    2. Slow integrations or failures: When working with big datasets, scheduled procedures or API-based integrations may suffer from performance deterioration, which might lead to delays in data processing or synchronization.
    3. Issues in data extraction: It may be essential to split or segment data when exporting large datasets for analytics or compliance reasons, as they might exceed system-imposed size restrictions.
    4. User frustration: Decision-making becomes more difficult when processes are disrupted by frequent failures or delays in getting crucial information.

    Such challenges arise from inefficient techniques for process design and data extraction or reporting, along with the inherent limits of handling huge volumes of data. In addition, it’s critical to resolve these performance issues while recognizing Oracle Fusion ERP’s limitations, especially the absence of direct database access and a strong dependence on seeded tools like BI Publisher, OTBI (Oracle Transactional Business Intelligence), and REST/SOAP APIs for data management.

    Reports and integrations that do not function properly could lead to delays in business-critical operations like supply chain management, regulatory filings, and financial closes. To ensure seamless operation, maintain user productivity, and increase system dependability, performance optimization for huge data reporting and associated procedures becomes vital.

    This blog tackles the challenges of handling large data sets in Oracle Fusion ERP, offering practical solutions to enhance performance. From effective filtering to advanced techniques like chunking and bursting, we address the root causes of slow reports, integrations, and data extractions. By adopting these best practices, you can optimize performance and future-proof your processes against the growing demands of a data-intensive business environment.

    Performance Tuning

    When we talk about improving the performance of a query or piece of code, we can divide the work into two major parts:

    • Code Optimization
    • Data Size

    In code optimization, we can use various methods, such as rearranging the query in a structured way by checking the joins and removing or realigning them properly. On the data size side, we can break down the data in the output itself by adding proper filters to reduce the size of the data generated by the SQL query, or we can use the chunking functionality provided by Oracle to split the data into multiple outputs.
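
Where the built-in chunking feature is not a fit, the same idea can be approximated by splitting an extraction into smaller parameter ranges and running it once per range. Below is a minimal sketch of that approach; run_extract is a hypothetical placeholder for the actual report call (for example, a parameterized BI Publisher report), and the month-by-month split is just one possible way to slice the data.

```python
from datetime import date, timedelta

def run_extract(p_from_date: date, p_to_date: date) -> str:
    """Placeholder for the real extraction call, e.g. submitting a
    parameterized BI Publisher report with these bind values."""
    return f"extract_{p_from_date}_{p_to_date}.csv"

def month_ranges(start: date, end: date):
    """Yield (range_start, range_end) pairs, one per calendar month."""
    current = start
    while current <= end:
        # First day of the following month, then step back one day.
        next_month = (current.replace(day=28) + timedelta(days=4)).replace(day=1)
        yield current, min(next_month - timedelta(days=1), end)
        current = next_month

def extract_in_chunks(start: date, end: date) -> list:
    """Run one smaller extract per month instead of a single oversized report."""
    return [run_extract(s, e) for s, e in month_ranges(start, end)]

print(extract_in_chunks(date(2024, 1, 15), date(2024, 3, 10)))
```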

    Code Optimization

    The first and most important step in tackling a large data problem is code optimization. We must first examine the flow of the SQL query and reorganize it with the expected results in mind. A query may run into performance issues because of missing or improper joins. To fix these kinds of problems, we first divide the query into smaller components and then debug each part to find the problematic portion. Code optimization can broadly be divided into three parts:

    1. Process outside ERP
      1. Process Data in Warehouse System
      2. Third Party Tools
    2. Query Re-Arrange
      1. Use of proper joins
      2. Removal of subqueries
      3. Remove unnecessary data
    3. Query Tuning
      1. Use of views
      2. Hints

    The outline above shows the various ways to tackle issues caused by the code. Let us discuss these methods in detail.

    Process Outside ERP System

    As we know, Oracle Fusion runs on a transactional database, so it comes with some inherent limitations. To overcome these performance limitations, we can use a data warehouse system, or we can take the help of the various third-party tools available in the market, such as Power BI.

    Process Data in Warehouse System:

    Data warehouse systems efficiently handle large ERP data volumes. Oracle Analytics Cloud (OAC) seamlessly integrates with Oracle ERP, allowing custom subject areas in the presentation layer, similar to OBIEE, or direct SQL query building. To generate reports, ERP data must be periodically extracted via scheduled reports, Business Intelligence Cloud Connector (BICC), or integrations. This reduces the data load, ensuring smooth processing. Warehousing is crucial for large reports like GL transactions or Aging reports, which are difficult to manage in transactional ERP databases.

    Use of third-party tools:

    Various third-party tools for reporting are available in the market, which can be used to generate the report in various ways. These tools can provide more insights in a better way than the reporting tool available in the ERP system. Some of these tools are Power BI (a Microsoft Product), SmartView for Oracle (can be used in Excel), or APEX Dashboards. These tools provide greater flexibility and efficiency in handling large datasets compared to standard ERP reporting solutions. By offloading data processing to external tools, businesses can significantly enhance report performance, minimize timeouts, and ensure smoother operations without being constrained by ERP limitations.

    Power BI allows users to connect directly to Oracle Fusion ERP data sources, process and visualize data more effectively, and apply advanced analytics without overloading the ERP system. By extracting and transforming data externally, reports can be generated faster and with richer insights.

    Oracle APEX is another powerful tool that enables the development of lightweight web applications and dashboards. It can be used to create customized reports that fetch and process data outside of the ERP, reducing the burden on the system while improving response times.

    Query Re-Arrange

    Processing data outside of the ERP system requires third-party tools or data warehouse systems, which can be costly and require high maintenance. To save that cost, we can optimize the query itself: remove subqueries by merging them into the main query, relook at the joins used in the query and rearrange them in a better way, and remove unnecessary data from the query. This way, we can save money, improve the performance of the query, and still get the desired results.

    Proper Joins

    A query will not perform well if proper joins are not used in the SQL. As a standard, we should always use proper joins to fetch the data. For large reports, if the joins are not maintained properly, the query might not return the desired result or, in worse scenarios, may not run at all. As the first step in query rearrangement, it is recommended to review the Oracle documentation when a query has performance issues in order to identify any obsolete columns or missed joins.

    Remove Subqueries

    Another bottleneck in time-consuming queries is subqueries. Subqueries are fine when they do not consume many resources, but in most cases they do, and they can also generate duplicate or improper data. The pros and cons of using subqueries are outlined below.

    Before jumping to any conclusion about whether to use subqueries, it is always recommended to go through the explain plan generated for the SQL to identify the part that is causing the issue. If the subquery is the part causing the problem, then it is advisable to rewrite it as a Common Table Expression (CTE); CTEs are generally faster to execute because of the way databases process and optimize them. Below is an explanation of when to use a subquery and when not to:

    • When to use subqueries:
      • When dealing with smaller amounts of data
      • When using aggregation logic
      • When an inline calculation is needed
      • When filtering data in the WHERE clause
    • When not to use subqueries:
      • When dealing with larger data sets
      • When it is feasible to use a CTE for faster execution
      • When the same logic is required multiple times
      • When query performance is critical
    Scenario | Use a subquery? | Better alternative
    Simple queries with filters or aggregations | ✅ Yes | N/A
    Checking for existence (IN, EXISTS) | ✅ Yes | N/A
    Queries involving large datasets | ❌ No | CTEs or JOINs
    Performance-sensitive queries | ❌ No | JOINs
    Recursive queries (e.g., Org Hierarchy) | ❌ No | Recursive CTEs
    Ranking and row-based operations | ❌ No | RTF / Excel functions

    Remove Unnecessary Data

    After applying the optimization techniques mentioned above, if there is still a performance issue, it is recommended to remove unnecessary columns from the query. We have often seen not-so-important columns added to a query, which can cause performance issues and slow down the execution time. Excluding these columns is recommended to improve the performance of the query.

    Query Tuning

    The final part of query optimization is tuning. As mentioned earlier, we need to check the explain plan to identify the part that is causing the performance issue. In the tuning phase, we need to rewrite the code based on the resources available in the Oracle ERP system, as direct database access is not available in Fusion. As developers, we can make proper use of the views provided by Oracle or make use of optimizer hints.

    Use of views:

    Views provided by Oracle are generally faster in execution than the underlying tables due to several factors, such as the proper joins maintained in the view logic and performance enhancements via indexing. For example, to fetch supplier data in Oracle Fusion, instead of using POZ_SUPPLIERS and joining it with multiple tables, it is advisable to use the Oracle-provided view POZ_SUPPLIERS_V, which already includes many of the useful columns required for the query.

    Use of Hints:

    Oracle SQL hints are instructions that help the optimizer determine the optimal query execution strategy. Hints can help optimize performance in Oracle Fusion, particularly when querying huge datasets in BI Publisher.