Smarter Screens, Smarter Solutions: Oracle’s UX Revolution Using AI with VBCS

Introduction

In Oracle Cloud, tools such as VBCS and OIC make it easy to build modern applications and to connect with various systems. By combining VBCS, OIC, and the database, we can create custom apps that meet user needs and fulfill business requirements.

With OCI’s AI services, we can go a step further and build apps that lower user effort and help users make smarter decisions by adding intelligence to their experience.

OCI offers several AI services, such as Vision, Language, and Document Understanding. Using these services, we can:

  • Read text from images and extract useful data (such as invoices or receipts).
  • Understand the emotion behind user feedback.
  • Analyze images and identify objects.

We can create smart applications by using VBCS for the user interface and OIC/DB for the business logic. In this blog, we will walk through how to combine VBCS, OIC, and OCI’s AI services and how these tools work together. First, we’ll cover the simple architecture; then we’ll share practical use cases for this setup; finally, we’ll outline the key steps to design and implement them.

Note: The focus of this blog is to give ideas about the processes and what’s possible rather than deep technical details or code.

Oracle Tools / Oracle Technology Stack

We are going to use the following Oracle Cloud offerings.

VBCS:

We will use VBCS to develop the user interface. We can build clean and responsive UIs using VBCS, along with JavaScript when needed. The VBCS application will be the frontend that users see.

OIC:

OIC will handle all the business logic. Data transformation, API calls, and connections to third-party systems will be handled by OIC. It will act as a bridge between the VBCS application and OCI AI services.

OCI AI Services:

Oracle offers ready-to-use AI services such as:

  • Vision – for analyzing images
  • Language – for extracting meaning or tone
  • Document Understanding – for reading and analyzing documents

You do not need to train models or handle complex data science. These services are prebuilt and easy to call from OIC using REST APIs.

Use Case – AP Invoice Capture using Document Understanding

In most cases, teams receive invoices as PDFs or scanned images.

After receiving the document, a user must manually enter all the required details, such as supplier name, invoice number, invoice date, and amount. This is a repetitive task that consumes a lot of the user’s time and effort.

With VBCS, OIC, and OCI’s Document Understanding AI service, we can automate most of this process and make it much faster.

Business Scenario:

A user uploads an invoice image in the VBCS application. The file is then sent to Oracle’s AI service, which reads the document and pulls out the key fields. Once OIC returns the response, the values appear on the VBCS page, where the user can verify them before submitting the details to Oracle ERP.

High Level Process Flow:

  1. The user uploads an invoice in the VBCS form
  2. OIC picks up the file and sends it to OCI Document Understanding
  3. The AI extracts the key data and sends the response back to OIC
  4. OIC maps and stores the required data, which is then displayed to the user
  5. The user reviews the details and submits the form
  6. A new invoice record is created in Oracle ERP
  7. The VBCS UI shows the status (Success/Error) after submitting to Oracle ERP

How to Achieve It:

  • In VBCS, use the file upload component to upload the images
  • In OIC, build an integration that takes the file, calls the OCI Document Understanding API, and parses the response
  • Map the extracted fields received in the response
  • Build an exception-handling mechanism for cases that cannot go straight through, such as when the AI cannot read the file or key fields are missing

Note:

For data accuracy, we need to keep a few things in mind.

While OCI Document Understanding does a good job with common invoice formats, accuracy can vary depending on the quality of the uploaded document and how structured it is.

Below are a few tips to improve the results:

  • Good Document Quality
    • Clear, high-resolution images (300 DPI or more) work best
    • Avoid documents with stamps or skewed layouts
  • Use the right extraction model
    • OCI Document Understanding offers multiple extraction models, such as key-value extraction and table extraction
    • Choose the one that best fits your document type, or test a few to see which works best for your needs
  • Post-processing in OIC
    • You can use data enrichment steps in OIC to clean or reformat the OCI results
    • For example, if the AI outputs a date in the wrong format, you can correct it before sending it to ERP.
  • Confidence Scores
    • OCI responses usually include a confidence level for each extracted field
    • You can set a threshold (e.g. only accept a value if its confidence level is above 90%) and flag the other values for manual review (see the sketch after this list)
  • Analyze data and train custom models
    • You can store the failed or low-confidence invoices and analyze them later to retrain custom models or to adjust the integration logic accordingly
    • This way, we can make the process more reliable and scalable over time
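
To make the confidence-score tip concrete, here is a minimal SQL sketch, assuming OIC stages each extracted field as a row in a database table. The table AI_INVOICE_FIELDS and its columns are hypothetical and only illustrate the idea; the same check can just as easily be done inside the OIC mapping.

```sql
-- Hypothetical staging table: one row per extracted field per invoice.
-- Fields above the threshold are auto-accepted; the rest are flagged for review.
SELECT invoice_id,
       field_name,
       field_value,
       confidence,
       CASE
         WHEN confidence >= 0.90 THEN 'AUTO_ACCEPT'
         ELSE 'MANUAL_REVIEW'
       END AS review_status
FROM   ai_invoice_fields
WHERE  status = 'NEW'
ORDER  BY invoice_id, field_name;
```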

Key Learnings, Challenges and Best Practices

    • Design for latency
      • Allow for short delays from AI calls by using loaders, async triggers, or batch processing.
    • Validate AI results through human review
      • Always give users the ability to accept, edit, or reject AI-generated output
    • Develop reusable flows
      • Create generic OIC flows (like a Vision API handler) that you can reuse across multiple applications

    Expanding on This Topic

    Using OCI AI services with Oracle Cloud Applications may sound like a new complication, but as we have seen, it is very much achievable using the available Oracle Cloud offerings: VBCS, OIC, and OCI AI Services.

    There are other interesting use cases that can also be implemented effectively using VBCS, OIC, and OCI AI Services:

    • Product classification through OCI Vision
      • Leverage OCI Vision’s image recognition to automatically categorize product images, with OIC orchestrating data ingestion and model invocation.
        VBCS provides an intuitive dashboard for users to review and refine classification results in real time.
    • Feedback analysis using OCI Language
      • Utilize OCI Language to perform sentiment analysis, entity recognition, and topic extraction on customer feedback ingested via OIC. VBCS shows important insights and trends in real time. This helps stakeholders fix problems fast and boost customer satisfaction.
    • Smart forms for compliance reporting
      • Build intelligent VBCS forms that leverage OCI AI Services to auto‑extract and validate compliance data from uploaded documents, with OIC handling backend workflows.
        The solution makes regulatory reporting easier. It adds real-time validation, error detection, and approval steps right in the app.

    By combining the right tools with a thoughtful architecture (the database for control and logic, OIC for middleware, and AI for insights), you can deliver scalable solutions.

    Processing Large Data in Oracle ERP System

    Introduction

    Oracle Fusion ERP helps businesses manage their operations and make decisions from the data available in the system. However, problems arise when data volumes increase as organizations grow. Working with large datasets can lead to performance issues when executing large reports, running integrations that rely on report data, or performing data extractions. For technical teams, handling huge volumes of data effectively can be quite challenging.

    Large datasets can cause several performance issues, like:

    1. Report failures: Scheduled or online reports can fail or time out when large data volumes hit system limits.
    2. Slow integrations or failures: When working with big datasets, scheduled procedures or API-based integrations may suffer from performance deterioration, which might lead to delays in data processing or synchronization.
    3. Issues in data extraction: It may be essential to split or segment data when exporting large datasets for analytics or compliance reasons, as they might exceed system-imposed size restrictions.
    4. User frustration: Decision-making becomes more difficult when processes are disrupted by frequent failures or delays in getting crucial information.

    Such challenges arise from inefficient techniques for process design and data extraction or reporting, along with the inherent limits of handling huge volumes of data. In addition, it’s critical to resolve these performance issues while recognizing Oracle Fusion ERP’s limitations, especially the absence of direct database access and a strong dependence on seeded tools like BI Publisher, OTBI (Oracle Transactional Business Intelligence), and REST/SOAP APIs for data management.

    Reports and integrations that do not function properly could lead to delays in business-critical operations like supply chain management, regulatory filings, and financial closes. To ensure seamless operation, maintain user productivity, and increase system dependability, performance optimization for huge data reporting and associated procedures becomes vital.

    This blog tackles the challenges of handling large data sets in Oracle Fusion ERP, offering practical solutions to enhance performance. From effective filtering to advanced techniques like chunking and bursting, we address the root causes of slow reports, integrations, and data extractions. By adopting these best practices, you can optimize performance and future-proof your processes against the growing demands of a data-intensive business environment.

    Performance Tuning

    When we talk about improving the performance of a query or code, we can divide the work into two major parts:

    • Code Optimization
    • Data Size

    In code optimization, we restructure the query itself, for example by reviewing the joins and removing or realigning them properly. On the data-size side, we can reduce the output by adding proper filters to limit the data generated by the SQL query, or we can use the chunking functionality provided by Oracle to split the data into multiple outputs.
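
As a simple illustration of the data-size side, the sketch below limits an AP extract with bind-parameter filters. The table is a standard Fusion AP table; the :p_from_date and :p_to_date parameters are illustrative BI Publisher-style binds.

```sql
-- Restrict the extract to a date window instead of pulling all invoices.
SELECT inv.invoice_id,
       inv.invoice_num,
       inv.invoice_date,
       inv.invoice_amount
FROM   ap_invoices_all inv
WHERE  inv.invoice_date BETWEEN :p_from_date AND :p_to_date;
```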

    Code Optimization

    The first and most important step in deconstructing a big-data problem is code optimization. We must first examine the flow of the SQL query and reorganize it in relation to the results. The query may run into performance issues because of missing or improper joins. To fix these kinds of problems, we first divide the query into smaller components and then debug them to find the problematic portion. Code optimization can broadly be divided into three parts:

    1. Process outside ERP
      1. Process Data in Warehouse System
      2. Third Party Tools
    2. Query Re-Arrange
      1. Use of proper joins
      2. Removal of subqueries
      3. Remove unnecessary data
    3. Query Tuning
      1. Use of views
      2. Hints

    The above breakdown shows the various ways to tackle issues caused by the code. Let us discuss these methods in detail.

    Process Outside ERP System

    Oracle Fusion is a transactional system, so it has some inherent limitations. To overcome these performance limitations, we can use a data warehouse system, or we can take the help of various third-party tools available in the market, such as Power BI.

    Process Data in Warehouse System:

    Data warehouse systems efficiently handle large ERP data volumes. Oracle Analytics Cloud (OAC) integrates seamlessly with Oracle ERP, allowing custom subject areas in the presentation layer, similar to OBIEE, or direct SQL query building. To generate reports, ERP data must be periodically extracted via scheduled reports, Business Intelligence Cloud Connector (BICC), or integrations. This reduces the load on the ERP, ensuring smooth processing. Warehousing is crucial for large reports like GL transaction or aging reports, which are difficult to manage in a transactional ERP database.

    Use of third-party tools:

    Various third-party reporting tools are available in the market and can be used to generate reports in different ways. These tools often provide better insights than the reporting tools available in the ERP system. Some of these tools are Power BI (a Microsoft product), SmartView for Oracle (usable in Excel), and APEX dashboards. They provide greater flexibility and efficiency in handling large datasets compared to standard ERP reporting solutions. By offloading data processing to external tools, businesses can significantly enhance report performance, minimize timeouts, and ensure smoother operations without being constrained by ERP limitations.

    Power BI allows users to connect directly to Oracle Fusion ERP data sources, process and visualize data more effectively, and apply advanced analytics without overloading the ERP system. By extracting and transforming data externally, reports can be generated faster and with richer insights.

    Oracle APEX is another powerful tool that enables the development of lightweight web applications and dashboards. It can be used to create customized reports that fetch and process data outside of the ERP, reducing the burden on the system while improving response times.

    Query Re-Arrange

    Processing data outside the ERP system requires third-party tools or a data warehouse, which can be costly and require high maintenance. To save cost, we can instead optimize the query itself: remove subqueries by merging them into the main query, relook at the joins we have used and rearrange them in a better way, and remove unnecessary data from the query. This way we can save money, improve query performance, and still get the desired results.

    Proper Joins

    A query will not perform well if proper joins are not used. As a standard, always use proper joins to fetch the data. For large reports, if the joins are not maintained properly, the query might not return the desired result or, in the worst case, will not run at all. When a query has performance issues, checking the Oracle documentation for obsolete columns or missed joins is the recommended first step in query rearrangement.
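
Below is a minimal sketch of a properly joined query. The tables are standard Fusion AP/supplier objects; verify the join columns against the Oracle documentation for your release.

```sql
-- Explicit ANSI join on the documented key instead of an implicit or missing join.
SELECT inv.invoice_num,
       inv.invoice_amount,
       sup.vendor_name
FROM   ap_invoices_all inv
JOIN   poz_suppliers   sup
  ON   sup.vendor_id = inv.vendor_id;
```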

    Remove Subqueries

    Another bottleneck in time-consuming queries is subqueries. Subqueries are fine if they do not consume too many resources, but in most cases they do, and they can also generate duplicate or improper data.

    Before jumping to any conclusion about whether to use subqueries, it is always recommended to go through the explain plan generated for the SQL to identify the part that is causing the issue. If the subquery is the problem, it is advisable to rewrite it as a Common Table Expression (CTE); CTEs are generally faster because of the way databases process and optimize them (a sketch of such a rewrite follows the table below). Below is a guide on when to use subqueries and when not to:

    • When to use subqueries:
      • When dealing with smaller amounts of data
      • When using aggregation logic
      • When an inline calculation is needed
      • When filtering data in a WHERE clause
    • When not to use subqueries:
      • When dealing with larger data sets
      • When it is feasible to use a CTE for faster execution
      • When the same logic is required multiple times
      • When query performance is important
    Scenario | Use a subquery? | Better alternative
    Simple queries with filters or aggregations | ✅ Yes | N/A
    Checking for existence (IN, EXISTS) | ✅ Yes | N/A
    Queries involving large datasets | ❌ No | CTEs or JOINs
    Performance-sensitive queries | ❌ No | JOINs
    Recursive queries (e.g., Org Hierarchy) | ❌ No | Recursive CTEs
    Ranking and row-based operations | ❌ No | RTF / Excel functions
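
Here is the kind of rewrite referred to above: a hedged sketch that moves a repeated per-invoice subquery into a CTE so the aggregation runs only once. The tables and columns are illustrative Fusion AP objects.

```sql
-- Aggregate the invoice lines once in a CTE and join to it, instead of
-- repeating a correlated subquery in the SELECT list for every invoice row.
WITH line_totals AS (
  SELECT lin.invoice_id,
         SUM(lin.amount) AS line_amount
  FROM   ap_invoice_lines_all lin
  GROUP  BY lin.invoice_id
)
SELECT inv.invoice_num,
       inv.invoice_date,
       lt.line_amount
FROM   ap_invoices_all inv
JOIN   line_totals     lt
  ON   lt.invoice_id = inv.invoice_id;
```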

    Remove Unnecessary Data

    If there is still a performance issue after applying the above optimization techniques, it is recommended to remove unnecessary columns from the query. We have often seen less important columns added to a query that slow down execution time; excluding them improves the performance of the query.

    Query Tuning

    The final part of query optimization is tuning. As mentioned earlier, we need to check the explain plan to identify the part that is causing the performance issue. In tuning, we rewrite the code based on the resources available in the Oracle ERP system, since direct database access is not available in Fusion. As developers, we can make proper use of the views provided by Oracle or use optimizer hints.

    Use of views:

    Views provided by Oracle are generally faster to execute than querying the base tables directly, due to factors such as proper joins maintained in the view logic and performance enhancements via indexing. For example, to fetch supplier data in Oracle Fusion, instead of using POZ_SUPPLIERS and joining it with multiple tables, it is advisable to use the Oracle-provided view POZ_SUPPLIERS_V, which already exposes many of the useful columns required for the query.
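
A minimal sketch of the supplier example follows. The selected columns are illustrative; check the view definition in your environment for the exact column names.

```sql
-- Query the seeded view, which already resolves the common supplier joins,
-- instead of manually joining POZ_SUPPLIERS to its related tables.
SELECT sup.vendor_name,
       sup.segment1 AS supplier_number,
       sup.vendor_type_lookup_code
FROM   poz_suppliers_v sup
WHERE  sup.vendor_name LIKE 'ACME%';
```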

    Use of Hints:

    Oracle SQL hints are instructions that help the optimizer determine the best query execution strategy. Hints can help optimize performance in Oracle Fusion, particularly when querying huge datasets in BI Publisher.
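
A short sketch of hint usage is shown below. The hints and tables are illustrative; always confirm with the explain plan that a hint actually helps before keeping it.

```sql
-- LEADING fixes the join order to start from the invoice table,
-- and USE_HASH asks for a hash join to the supplier table.
SELECT /*+ LEADING(inv) USE_HASH(sup) */
       inv.invoice_num,
       sup.vendor_name
FROM   ap_invoices_all inv
JOIN   poz_suppliers   sup
  ON   sup.vendor_id = inv.vendor_id;
```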

    Implementing ERP with Downstream Applications: Strategies for Decommissioning

    Introduction 

    In today’s fast-changing business world, firms adopt ERP systems to improve operations and data management. ERP systems integrate many business processes into a unified system, streamlining data handling and oversight. But integrating ERP systems with downstream applications, the apps that interact with or depend on the ERP, presents both big opportunities and challenges. This blog covers the need for integration, its challenges, and ways to decommission redundant downstream apps.

    Overview of ERP and Downstream Applications 

    Enterprise Resource Planning (ERP) systems are platforms that manage and automate core business functions, including finance, HR, supply chain, and customer relationships. ERP systems centralize data and provide a single source of truth for processes.

    Downstream applications are specialized tools that interact with the ERP system, such as CRM software, supply chain systems, or industry-specific apps. They serve specific functions that the ERP system may not address completely.

    Importance of Integrating ERP with Downstream Applications for Data Management 

    Integrating ERP systems with downstream applications is vital for effective data management. Seamless integration allows data to flow between systems. It reduces redundancy and improves accuracy. This integration helps decision-making, boosts efficiency, and ensures consistency across business units. 

    Challenges and Impacts 

    Data consistency and accuracy 

    • Challenge: Keeping data updated in all systems can be difficult. Discrepancies in data formats, structures, or real-time updates can lead to inaccuracies. 
    • Impact: Inconsistent data can result in erroneous reporting, financial discrepancies, and operational inefficiencies. 

    Complexity of Integration 

    • Challenge: Integrating ERP systems with many downstream apps is complex. It involves mapping data fields, aligning formats, and managing different schemas. 
    • Impact: High complexity can cause delays, increased time demands, and a need for technical experts. 

    Real-Time Data Processing 

    • Challenge: It’s tough to sync real-time data between ERP and downstream apps, especially with large data volumes. 
    • Impact: Delays in data sync can cause outdated info. This affects decisions and efficiency. 

    User Training and Adoption 

    • Challenge: Users must be trained on the new integrated systems and workflows. 
    • Impact: Poor training can lead to user errors, resistance to change, and a less effective system. 

     

    Case Study: ERP Compared with Other Systems 

    • HCM (Human Capital Management) and CRM (Customer Relationship Management)  

    Used When: To manage your workforce and customers. Ideal for businesses that prioritize employee growth and customer engagement. This includes service-oriented companies.  

    Benefits: Improved HR and customer processes raise employee and customer satisfaction. 

    • SCM, FIN, and CX  

    Used When: You need to manage supply chains, financial operations, and improve customer experience. This is common in manufacturing and retail. There, delivery and finances are critical.  

    Benefits: Optimized supply chains cut costs. Strong finances ensure sustainability. Better CX boosts brand loyalty. 

    • HCM, CRM, SCM, FIN, and CX Combined  

    Used When: Your organization needs a unified approach across all functions. It’s vital for large firms in competitive markets to have cohesive strategies.  

    Benefits: A unified system boosts communication and data sharing. It aligns strategies across departments, increasing efficiency and growth. 

    By assessing your business needs and goals, you can find the best mix of these systems for peak performance. 

    Before implementing an ERP system, check what functions existing systems can manage. We can group this analysis into three types: doable, tweakable, and non-doable cases. 

    1. Doable Cases

    These are functions that other systems can handle, with little change. Examples:

    • Inventory Management: Many standalone systems can track stock, manage reorders, and provide reports. 
    • Customer Relationship Management (CRM): CRM systems can manage customer interactions and sales pipelines without an ERP. 
    • Accounting and Financial Management: Good accounting software can handle financial tasks and reports. 

    Implication: In these cases, organizations may keep existing systems, as they already meet operational needs efficiently. 

    2. Tweakable Cases

    Other systems can perform these functions, but they need customization or integration to match the ERP’s capabilities. Examples:

    • Human Resources Management: HR software can manage employee records and payroll, but it may need customization to fit the organization’s processes and to work with other systems, such as finance and inventory. 
    • Supply Chain Management: Existing systems may support logistics, but they may need extra setup to ensure full tracking and reporting across the supply chain. 
    • Project Management: Project management tools can track tasks and timelines, but integrating financial data from accounting software may require custom solutions. 

    Implication: Organizations may invest in customization to improve efficiency, but this may raise costs and complexity. 

    3. Non-Doable Cases

    These are functions that existing systems cannot manage, due to technology limitations or poor integration. Examples:

    • Comprehensive Reporting: Some systems can generate reports, but they often cannot pull data from multiple sources, so they cannot create insights across departments. 
    • Regulatory Compliance: Many standalone systems are not integrated and cannot automate compliance processes, which makes it hard to manage documentation and reporting. 
    • Integrated Workflow Management: Standalone systems may not support the ERP’s end-to-end workflow, which can cause gaps in communication and efficiency. 

    Implication: For these cases, an ERP system may be vital, as no other solution can fully replicate the required functions. It is key for efficiency and compliance. 

    Objectives and Key Steps for Decommissioning Downstream Applications 

    Phase 1: Assessment and Identification 

    1. Identify Low-Value Applications
    • Objective: Identify applications that are non-essential or of little value to the organization. 

    Key Steps 

    Compile a List of Downstream Applications → Conduct a Value Assessment → Prioritize Applications

    2. Evaluate Integration Feasibility
    • Objective: Assess the applications’ integration with the ERP and other systems. This will help us understand the complexity of decommissioning. 

    Key Steps 

    Map Integration Points → Review Integration Contracts and SLAs → Develop a Decommissioning Strategy 

      

    Phase 2: Planning and Communication 

    1. Develop a Decommissioning Plan
    • Objective: Create a detailed plan for decommissioning the identified applications. 

    Key Steps 

    Define Objectives and Scope → Create a Timeline → Assign Responsibilities and Identify Resources

    2. Communicate with Stakeholders
    • Objective: Inform all relevant parties about the decommissioning process and their roles. 

    Key Steps 

    Develop a Communication Plan → Conduct Briefings → Provide Training and Support 

      

    Phase 3: Execution and Migration 

    1. Migrate Data and Processes
    • Objective: Transfer or archive data and adjust processes to ensure continuity after decommissioning. 

    Key Steps 

    Data Migration → Process Transition → Test Data and Processes 

     

    Conclusion 

    In today’s fast-changing business world, linking ERP systems with downstream applications is essential to optimize data management and boost efficiency. This integration improves data accuracy, reduces redundancy, and streamlines business processes, resulting in better decision-making and higher performance. However, the integration process has challenges, including data consistency, system complexity, and real-time processing issues. Effective decommissioning of redundant downstream apps needs a phased approach: start with assessment and identification, then plan and communicate in detail, and finally execute and migrate. By addressing these aspects, organizations can ensure a smooth transition to systems that are efficient, accurate, and aligned with business goals. 

    Call To Action 

    To maximize the benefits of ERP integration with downstream apps, organizations must be strategic. Start by checking your downstream apps for redundancy and low value. Next, assess the feasibility of integrating these apps with your ERP system to understand the complexities involved. Make a detailed decommissioning plan and communicate well with all stakeholders to ensure a smooth transition. Execute the migration with precision, ensuring data integrity and process continuity. These steps will help your organization streamline operations, improve data accuracy, and boost efficiency. Begin your decommissioning process today to optimize your ERP integration and improve business outcomes. 

     

    APEX vs VBCS: How to make the right choice?

    Low-code tools have been the buzzword for some time now: platforms that enable the developer community to produce applications with a fraction of the effort and very few lines of code. These platforms empower developers to create software solutions with minimal manual coding, leveraging visual interfaces, pre-built components, and drag-and-drop functionality.

    Oracle has recognized the importance of low-code development and has positioned itself as a key player in this space. Instead of one, Oracle has two solutions in the low-code space: APEX (Application Express) and VBCS (Visual Builder Cloud Service). But why two low-code tools? What is Oracle doing? And which tool should a developer or an end customer choose?

    On its site, Oracle positions APEX for building “data-driven applications” and VBCS for building “extensions for cloud applications”. The main difference is clear from the taglines themselves.

    APEX has been around for roughly 20 years and is one of Oracle’s prime development tools; Oracle has invested heavily in it and improved it constantly. Although it builds robust, modern-looking web apps, at its core it uses simple SQL and PL/SQL, which old-school developers find very friendly. It can run in the cloud or on premises. The tool targets both citizen and professional developers (citizen developers being those who know only the basics of coding and intend to create lighter applications), and it allows developers to leverage their SQL and PL/SQL skills to build data-centric applications rapidly.

    VBCS, on the other hand, is a newer addition to Oracle’s low-code offerings. The motivation was to have a truly cloud-native, web-based platform built on JavaScript. VBCS is based on Oracle JET (JavaScript Extension Toolkit), and since JavaScript is the de facto standard for web application development, it gives web application developers a more advanced version of what they generally do. It supports Git, Docker, and CI/CD, which makes it more robust from a DevOps perspective, and for simpler apps it is near no-code. The main use case for VBCS is building extensions to Oracle Cloud services such as Oracle Fusion; it integrates seamlessly with other Oracle Cloud services, making it a compelling choice for organizations already invested in Oracle’s cloud ecosystem. The data source is generally a REST endpoint, making it largely database-neutral. One of its main advantages is the ability to create truly native mobile applications.

     | APEX | VBCS
    Positioning | 20x faster with 100x less code | Extensions to Oracle Cloud apps
    Cost | Free with the Oracle Database | $ per OCPU; counts message packs in relation to active users and the OIC integration calls it makes
    Data source / developer mindset | Oracle Database: think of a relational data model structure underneath | REST endpoints: plan/design REST endpoints carefully; can be any REST-enabled database schema as well
    DevOps | APEX Team Development: APEX maintains the application inside the database, hence not compatible with DevOps tools | Visual Builder Studio: better DevOps than APEX as it supports Git, CI/CD, Docker, etc.; Visual Builder has better product lifecycle management
    Web app | Y | Y
    Mobile application development | | Y
    Oracle SaaS extensions | | Y
    Product maturity and bigger community | Y |
    Deployment | Sits on top of the Oracle Database, hence tightly coupled with it; can be deployed on-premises or in the cloud | Neutral to the Oracle database; a true web-app/mobile-app development platform, deployed in the cloud
    Developer skill prerequisites | SQL & PL/SQL | JavaScript, REST, Oracle JET, database

    So now let’s try to answer the questions we raised initially:

    Why two low-code tools instead of one?

    The decision likely stems from Oracle’s recognition that different developers have different preferences, skill sets, and project requirements. By offering both APEX and VBCS, Oracle can cater to a wider audience and provide solutions tailored to various use cases. Additionally, having multiple low-code platforms allows Oracle to address different aspects of application development, from data-centric applications with APEX to more modern, user interface-focused applications with VBCS. This strategy enables Oracle to stay competitive in the rapidly evolving low-code development market and better serve the diverse needs of its customers.

    Which to Choose?

    Go for VBCS if you need an Oracle Fusion extension app, if you want a mobile application, or if you want a truly platform-neutral web app.

    Go for APEX if your current skill set is SQL/PL/SQL, or if you want to keep maintenance costs low.

    By considering these factors, organizations or developers can make informed decisions about which Oracle low-code platform best suits their specific requirements and goals. Whether prioritizing skill compatibility, cost-effectiveness, or specialized functionalities, Oracle offers options to meet diverse needs in application development.

    Conclusion:

    It is indeed a tale of two siblings, both children of a powerful father named Oracle. Both APEX and VBCS benefit from Oracle’s vast resources, developer community, user groups, and commitment to ease of learning and continuous product improvement.

    The elder one, APEX, is traditional and old-school (SQL/PL/SQL), robust, proven, well established, and conservative with money (free with the database).

    The younger one, VBCS, is modern (JavaScript, mobile, and true web apps), innovative, looking to solve many futuristic problems, more expensive but promising.