Mittwoch, 17. Januar 2024

Leveraging OCI Document Understanding AI for Document Processing

Leveraging AI for Document Processing with the Document AI Scanner

Oracle's Document Understanding AI service provides a powerful way to extract insights from scanned documents using artificial intelligence. The Document AI Scanner, an open source tool I developed, is built on Oracle APEX, integrates with the Oracle Cloud Infrastructure Document Understanding service, and streamlines the intake and analysis of document collections.

I was first introduced to Document AI's capabilities and REST API through Jon Dixon's insightful blog Quickly Turn Text into Data with APEX & Document AI.

In his blog, Jon provides a comprehensive overview of Document AI and demonstrates how to call the AnalyzeDocument endpoint directly from an APEX application. 

What it Does


The Document AI Scanner allows users to upload multiple PDF or image files for processing as a batch. It then passes the documents to the Document Understanding AI service, which performs text extraction, table and field detection, document classification, and more.

Once analysis is complete, users can view detailed results within the app or download them as a ZIP file. Key-value pairs and other extracted data can also be exported directly to an accounting database for further processing.

Key Capabilities

  • Full text extraction and searchability of uploaded documents
  • Identification of tables and structured fields
  • Classification of document types like invoices or forms
  • Language detection to support proper NLS settings
  • Long running jobs that don't time out
  • Downloadable analysis files for record keeping

Behind the Scenes

The app post-processes the raw AI output with custom algorithms. Using techniques such as relative positioning and data type validation, it assembles key-value pairs from text fragments.

My client needed to process German bills, but the text recognition was limited to the ASCII character set. This meant German umlauts and the "ß" (sz) were missing from the extracted text, and date and number formats were misinterpreted. To address this, I developed my own text interpreter by hand-crafting rules to handle these language-specific cases and produce better key-value pairs from the documents.

  • The absence of German characters in the OCR output made it hard to match names and addresses within the address table. A special index had to be built to support matching despite missing characters like umlauts and ß.
  • Dates, numbers and currencies are formatted differently in German than in English. It took work to parse these values accurately based on local conventions (see the sketch after this list).
  • German abbreviations and acronyms that may be used in bills needed to be researched and accounted for in the interpreter rules.
  • Idiomatic phrases, terminology and jargon specific to billing/invoicing in German posed challenges to understand and interpret correctly.
  • Ambiguous or incomplete text fragments from OCR output limited the amount of context available to accurately interpret meanings. Additional heuristics may have been needed.
  • Ensuring the interpreter worked robustly across a variety of different bill/invoice styles, templates and edge cases from various issuing companies.
  • Debugging and testing the complex set of hand-crafted rules required considerable effort to refine the interpreter to a useful level of accuracy.
  • Integrating the custom interpreter with the existing application architecture and output formats required careful programming.
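
To give an idea of the kind of rules involved, here is a minimal sketch of German-aware matching and parsing, using a hypothetical ADDRESSES table and column names (the interpreter in the app is considerably more elaborate):

 -- function-based index that lets ASCII-only OCR text match names containing umlauts and ß
 CREATE INDEX ADDRESSES_ASCII_NAME_IDX ON ADDRESSES (
   REPLACE(TRANSLATE(UPPER(NAME), 'ÄÖÜ', 'AOU'), 'ß', 'SS')
 );

 -- parse German date and number formats from OCR text fragments
 DECLARE
   v_due_date DATE;
   v_amount   NUMBER;
 BEGIN
   v_due_date := TO_DATE('17.01.2024', 'DD.MM.YYYY');
   v_amount   := TO_NUMBER('1.234,56', '999G999D99',
                           'NLS_NUMERIC_CHARACTERS='',.''');
 END;
 /
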
Developing the customized interpreter within the limited capabilities of the general Document AI service was a challenge. My hope is that Oracle continues expanding the service to support more languages out of the box. Alternatively, the ability to train custom models is valuable, but it requires significant labeled data, which I did not have for this task.

The Document AI Scanner application showcases how technical challenges with language and interpretation can be overcome through a combination of commercial and custom solutions. While developing the bespoke text interpreter involved effort, it allowed valuable insights to be extracted from an otherwise limited data set.

For those interested in exploring the application further or contributing enhancements, the source code is available on GitHub at:

https://github.com/dstrack/Document-AI-Scanner

Please check out the repo for full implementation details, installation instructions, and how to get involved with ongoing development. I hope this project helps advance capabilities for automated document processing globally.



Dienstag, 11. Januar 2022

Automatically publish translations after changes in an APEX app

Seed & Publish Translations Plug-In

There is a little annoyance that you encounter when you develop multilingual APEX apps. When your primary language is English and the current session language is different, for example German, then changes in your application will not be displayed in that session, because APEX continues to display an older version that you published earlier. Even when the change is invisible program logic, it will be ignored.

A recurring task for APEX developers is: whenever any modification is made to your primary application, perform the "Seed" and "Publish" operations to recreate an updated version of your translated application(s). I had to repeat this task dozens of times while tweaking the layout for a second language.

You have to seed and publish the translations to see the change in your translated session. Since APEX provides API functions for seeding and publishing translations, and metadata about the last publishing date can be retrieved from a view, this can be automated with an APEX process Plug-In. While implementing the Plug-In I reused a solution from the Data Browser app that publishes the translated apps in a background process. When you set up the environment with apex_util.set_security_group_id, as described in the APEX API Reference, you can run apex_lang.seed_translations and apex_lang.publish_application inside a procedure that is called from a scheduler job. When the procedure is defined with AUTHID DEFINER, the scheduler job is able to find the translations. When you execute the same logic in an anonymous code block instead, the call to apex_util.set_security_group_id has to be replaced by a call to apex_session.attach to enable the apex_lang API calls.
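
A minimal sketch of such a procedure, assuming the apex_application_trans_map dictionary view and hypothetical names (the Plug-In's actual code differs in detail):

 create or replace procedure publish_app_translations (
   p_workspace      in varchar2,
   p_application_id in number )
 authid definer
 is
 begin
   -- set the workspace context so that the apex_lang calls find the translations
   apex_util.set_security_group_id(
     p_security_group_id => apex_util.find_security_group_id(p_workspace => p_workspace));
   -- seed and publish every mapped translation language of the application
   for l_map in (
     select translated_app_language
     from apex_application_trans_map
     where primary_application_id = p_application_id )
   loop
     apex_lang.seed_translations(
       p_application_id => p_application_id,
       p_language       => l_map.translated_app_language );
     apex_lang.publish_application(
       p_application_id => p_application_id,
       p_language       => l_map.translated_app_language );
   end loop;
 end publish_app_translations;
 /

The procedure can then be submitted as an asynchronous scheduler job, for example:

 begin
   dbms_scheduler.create_job (
     job_name   => 'PUBLISH_TRANSLATIONS_JOB',
     job_type   => 'PLSQL_BLOCK',
     job_action => 'begin publish_app_translations(''MY_WORKSPACE'', 100); end;',
     enabled    => true,
     auto_drop  => true );
 end;
 /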

This Plug-In keeps the translated applications up-to-date automatically by seeding and publishing the running application with an asynchronous background job.

The Plug-In solves the following problems:

  1. When you develop multilingual APEX applications, you have to repeat the "Seed" and "Publish" operations every time you make a change. For example: when your application's primary language is English and you inspect the German version of your app, you may find truncated labels because the translated item labels are much longer. After you change the "Column Span" of an item, you have to repeat the publish translations operation to see the result of your change in the German app.
  2. When you install multilingual APEX applications on a different website via the import application process, you have to manually perform the publish operation to see the translated version of your app.
  3. You have to wait many seconds until the "Seed" and "Publish" operations are completed before you can continue with your work.

The Plug-In process performs the following steps:

  • Retrieve the last updated date of the application.
  • Retrieve the value of a preference variable PUBLISH_TRANSLATIONSXX where XX is the application id.
  • Compare the last updated date with the value of the preference variable and quit when the values are equal.
  • Store the last updated date in the preference variable.
  • Create a small script to seed & publish the app in all supported languages.
  • Execute the script immediately or asynchronously (a simplified sketch of these steps follows below).
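
A minimal sketch of the date check and preference handling, assuming the apex_applications dictionary view (the Plug-In itself then builds and submits the seed & publish script):

 declare
   l_app_id       number := v('APP_ID');
   l_last_updated varchar2(30);
   l_published    varchar2(30);
 begin
   -- last updated timestamp of the running application
   select to_char(last_updated_on, 'YYYYMMDDHH24MISS')
   into l_last_updated
   from apex_applications
   where application_id = l_app_id;
   -- timestamp stored at the last publish run
   l_published := apex_util.get_preference(p_preference => 'PUBLISH_TRANSLATIONS' || l_app_id);
   if l_published is null or l_published != l_last_updated then
     apex_util.set_preference(
       p_preference => 'PUBLISH_TRANSLATIONS' || l_app_id,
       p_value      => l_last_updated );
     -- ... create and run the seed & publish script here ...
   end if;
 end;
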
See the GitHub repository for more details:

A simple Demo App is provided in the repository and runs at:



Samstag, 3. Juli 2021

APEX API Tracer

Generate APEX_DEBUG Calls to Trace Parameter Values for whole packages

APEX API Tracer enables the tracing of calls to package procedures or functions into the APEX Debug Log without manual programming.

This program can generate a package for tracing automatically when the following conditions apply:

  1. The package is accessible to the schema user via a synonym.
  2. The procedure or function is listed in the package header.
  3. The package header is not wrapped.

The enable procedure generates a package with the same name as the synonym in your local schema.

Calls from your local applications that resolve through the synonym are intercepted by the generated package.

The package will contain the same functions and procedures as the original package.

When the Option 'Logging of Procedure Start' is set to Yes, then each function and procedure consists of:  

  1. A call to APEX_DEBUG.LOG_LONG_MESSAGE that produces log entries containing valid PL/SQL procedure calls with quoted parameters, which you can copy and paste into a SQL console for testing. Arguments of type IN and IN/OUT are logged.
  2. A call that invokes the original procedure or function.
  3. A call to APEX_DEBUG.LOG_LONG_MESSAGE that produces log entries for the output values and return values of the invocation. Arguments of type OUT and IN/OUT are logged.

When the Option 'Logging of Procedure Start' is set to No, then each function and procedure consists of:  

  1. A call that invokes the original procedure or function.
  2. A call to APEX_DEBUG.LOG_LONG_MESSAGE that produces log entries for all argument values and function return values of the invocation. Arguments of type IN, OUT, and IN/OUT are logged.

For other packages where the above-mentioned conditions do not apply, you can manually add generated code with calls to the api_trace package to support logging for prepared functions or procedures in your own packages.

Usage

Install the APEX application with supporting objects.

Start the application and use the switch in the column 'Tracing is Enabled' to start and stop the tracing of a package. For some packages, the app lists grant statements that you have to execute as an admin.


Example output for the package APEX_LANG:

 create or replace NONEDITIONABLE PACKAGE       "APEX_LANG"  
 is  
 -- This package was generated by package_tracer from Strack Software Development, Berlin, Germany.  
 -- It replaces : PUBLIC SYNONYM "APEX_LANG" for "APEX_190100"."HTMLDB_LANG"  
 function message (  
   p_name           in varchar2 default null,  
   p0             in varchar2 default null,  
   p1             in varchar2 default null,  
   p2             in varchar2 default null,  
   p3             in varchar2 default null,  
   p4             in varchar2 default null,  
   p5             in varchar2 default null,  
   p6             in varchar2 default null,  
   p7             in varchar2 default null,  
   p8             in varchar2 default null,  
   p9             in varchar2 default null,  
   p_lang           in varchar2 default null,  
   p_application_id      in number default null)  
   return varchar2  
   ;  
 procedure message_p (  
   p_name           in varchar2 default null,  
   p0             in varchar2 default null,  
   p1             in varchar2 default null,  
   p2             in varchar2 default null,  
   p3             in varchar2 default null,  
   p4             in varchar2 default null,  
   p5             in varchar2 default null,  
   p6             in varchar2 default null,  
   p7             in varchar2 default null,  
   p8             in varchar2 default null,  
   p9             in varchar2 default null,  
   p_lang           in varchar2 default null,  
   p_application_id      in number  default null)  
   ;  
 function lang (  
   p_primary_text_string    in varchar2 default null,  
   p0             in varchar2 default null,  
   p1             in varchar2 default null,  
   p2             in varchar2 default null,  
   p3             in varchar2 default null,  
   p4             in varchar2 default null,  
   p5             in varchar2 default null,  
   p6             in varchar2 default null,  
   p7             in varchar2 default null,  
   p8             in varchar2 default null,  
   p9             in varchar2 default null,  
   p_primary_language     in varchar2 default null)  
   return varchar2  
   ;  
 procedure create_message(  
   p_application_id in number,  
   p_name      in varchar2,  
   p_language    in varchar2,  
   p_message_text  in varchar2 )  
   ;  
 procedure update_message(  
   p_id      in number,  
   p_message_text in varchar2 );  
 procedure delete_message(  
   p_id in number );  
 procedure update_translated_string(  
   p_id    in number,  
   p_language in varchar2,  
   p_string  in varchar2);  
 procedure seed_translations(  
   p_application_id in number,  
   p_language    in varchar2 );  
 procedure create_language_mapping(  
   p_application_id       in number,  
   p_language          in varchar2,  
   p_translation_application_id in number);  
 procedure update_language_mapping(  
   p_application_id       in number,  
   p_language          in varchar2,  
   p_new_trans_application_id  in number);  
 procedure delete_language_mapping(  
   p_application_id in number,  
   p_language    in varchar2);  
 procedure publish_application(  
   p_application_id      in number,  
   p_language         in varchar2,  
   p_new_trans_application_id in number default null );  
 procedure emit_language_selector_list;  
 end APEX_LANG;  
 /  
   
 create or replace NONEDITIONABLE PACKAGE BODY       "APEX_LANG"  
 IS   
 function message (  
   p_name           in varchar2 default null,  
   p0             in varchar2 default null,  
   p1             in varchar2 default null,  
   p2             in varchar2 default null,  
   p3             in varchar2 default null,  
   p4             in varchar2 default null,  
   p5             in varchar2 default null,  
   p6             in varchar2 default null,  
   p7             in varchar2 default null,  
   p8             in varchar2 default null,  
   p9             in varchar2 default null,  
   p_lang           in varchar2 default null,  
   p_application_id      in number default null)  
   return varchar2  
     
 is  
   lv_result VARCHAR2(32767);  
 begin  
   lv_result := apex_190100.htmldb_lang.message(p_name,p0,p1,p2,p3,p4,p5,p6,p7,p8,p9,p_lang,p_application_id);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.message('  
    || 'p_Name=>' || api_trace.Literal(p_Name)  
    || ', p0=>' || api_trace.Literal(p0)  
    || ', p1=>' || api_trace.Literal(p1)  
    || ', p2=>' || api_trace.Literal(p2)  
    || ', p3=>' || api_trace.Literal(p3)  
    || ', p4=>' || api_trace.Literal(p4)  
    || ', p5=>' || api_trace.Literal(p5) || chr(10)  
    || ', p6=>' || api_trace.Literal(p6)  
    || ', p7=>' || api_trace.Literal(p7)  
    || ', p8=>' || api_trace.Literal(p8)  
    || ', p9=>' || api_trace.Literal(p9)  
    || ', p_Lang=>' || api_trace.Literal(p_Lang)  
    || ', p_Application_Id=>' || api_trace.Literal(p_Application_Id) || ')' || ' returns '  
    || api_trace.Literal(lv_result), p_level=>5);  
   return lv_result;  
 end message;  
   
 procedure message_p (  
   p_name           in varchar2 default null,  
   p0             in varchar2 default null,  
   p1             in varchar2 default null,  
   p2             in varchar2 default null,  
   p3             in varchar2 default null,  
   p4             in varchar2 default null,  
   p5             in varchar2 default null,  
   p6             in varchar2 default null,  
   p7             in varchar2 default null,  
   p8             in varchar2 default null,  
   p9             in varchar2 default null,  
   p_lang           in varchar2 default null,  
   p_application_id      in number  default null)  
     
 is  
 begin  
   apex_190100.htmldb_lang.message_p(p_name,p0,p1,p2,p3,p4,p5,p6,p7,p8,p9,p_lang,p_application_id);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.message_p('  
    || 'p_Name=>' || api_trace.Literal(p_Name)  
    || ', p0=>' || api_trace.Literal(p0)  
    || ', p1=>' || api_trace.Literal(p1)  
    || ', p2=>' || api_trace.Literal(p2)  
    || ', p3=>' || api_trace.Literal(p3)  
    || ', p4=>' || api_trace.Literal(p4)  
    || ', p5=>' || api_trace.Literal(p5) || chr(10)  
    || ', p6=>' || api_trace.Literal(p6)  
    || ', p7=>' || api_trace.Literal(p7)  
    || ', p8=>' || api_trace.Literal(p8)  
    || ', p9=>' || api_trace.Literal(p9)  
    || ', p_Lang=>' || api_trace.Literal(p_Lang)  
    || ', p_Application_Id=>' || api_trace.Literal(p_Application_Id) || ')', p_level=>5);  
 end message_p;  
   
 function lang (  
   p_primary_text_string    in varchar2 default null,  
   p0             in varchar2 default null,  
   p1             in varchar2 default null,  
   p2             in varchar2 default null,  
   p3             in varchar2 default null,  
   p4             in varchar2 default null,  
   p5             in varchar2 default null,  
   p6             in varchar2 default null,  
   p7             in varchar2 default null,  
   p8             in varchar2 default null,  
   p9             in varchar2 default null,  
   p_primary_language     in varchar2 default null)  
   return varchar2  
     
 is  
   lv_result VARCHAR2(32767);  
 begin  
   lv_result := apex_190100.htmldb_lang.lang(p_primary_text_string,p0,p1,p2,p3,p4,p5,p6,p7,p8,p9,p_primary_language);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.lang('  
    || 'p_Primary_Text_String=>' || api_trace.Literal(p_Primary_Text_String)  
    || ', p0=>' || api_trace.Literal(p0)  
    || ', p1=>' || api_trace.Literal(p1)  
    || ', p2=>' || api_trace.Literal(p2)  
    || ', p3=>' || api_trace.Literal(p3)  
    || ', p4=>' || api_trace.Literal(p4)  
    || ', p5=>' || api_trace.Literal(p5) || chr(10)  
    || ', p6=>' || api_trace.Literal(p6)  
    || ', p7=>' || api_trace.Literal(p7)  
    || ', p8=>' || api_trace.Literal(p8)  
    || ', p9=>' || api_trace.Literal(p9)  
    || ', p_Primary_Language=>' || api_trace.Literal(p_Primary_Language) || ')' || ' returns '  
    || api_trace.Literal(lv_result), p_level=>5);  
   return lv_result;  
 end lang;  
   
 procedure create_message(  
   p_application_id in number,  
   p_name      in varchar2,  
   p_language    in varchar2,  
   p_message_text  in varchar2 )  
     
 is  
 begin  
   apex_190100.htmldb_lang.create_message(p_application_id,p_name,p_language,p_message_text);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.create_message('  
    || 'p_Application_Id=>' || api_trace.Literal(p_Application_Id)  
    || ', p_Name=>' || api_trace.Literal(p_Name)  
    || ', p_Language=>' || api_trace.Literal(p_Language)  
    || ', p_Message_Text=>' || api_trace.Literal(p_Message_Text) || ')', p_level=>5);  
 end create_message;  
   
 procedure update_message(  
   p_id      in number,  
   p_message_text in varchar2 )  
 is  
 begin  
   apex_190100.htmldb_lang.update_message(p_id,p_message_text);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.update_message('  
    || 'p_Id=>' || api_trace.Literal(p_Id)  
    || ', p_Message_Text=>' || api_trace.Literal(p_Message_Text) || ')', p_level=>5);  
 end update_message;  
   
 procedure delete_message(  
   p_id in number )  
 is  
 begin  
   apex_190100.htmldb_lang.delete_message(p_id);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.delete_message('  
    || 'p_Id=>' || api_trace.Literal(p_Id) || ')', p_level=>5);  
 end delete_message;  
   
 procedure update_translated_string(  
   p_id    in number,  
   p_language in varchar2,  
   p_string  in varchar2)  
 is  
 begin  
   apex_190100.htmldb_lang.update_translated_string(p_id,p_language,p_string);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.update_translated_string('  
    || 'p_Id=>' || api_trace.Literal(p_Id)  
    || ', p_Language=>' || api_trace.Literal(p_Language)  
    || ', p_String=>' || api_trace.Literal(p_String) || ')', p_level=>5);  
 end update_translated_string;  
   
 procedure seed_translations(  
   p_application_id in number,  
   p_language    in varchar2 )  
 is  
 begin  
   apex_190100.htmldb_lang.seed_translations(p_application_id,p_language);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.seed_translations('  
    || 'p_Application_Id=>' || api_trace.Literal(p_Application_Id)  
    || ', p_Language=>' || api_trace.Literal(p_Language) || ')', p_level=>5);  
 end seed_translations;  
   
 procedure create_language_mapping(  
   p_application_id       in number,  
   p_language          in varchar2,  
   p_translation_application_id in number)  
 is  
 begin  
   apex_190100.htmldb_lang.create_language_mapping(p_application_id,p_language,p_translation_application_id);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.create_language_mapping('  
    || 'p_Application_Id=>' || api_trace.Literal(p_Application_Id)  
    || ', p_Language=>' || api_trace.Literal(p_Language)  
    || ', p_Translation_Application_Id=>' || api_trace.Literal(p_Translation_Application_Id) || ')', p_level=>5);  
 end create_language_mapping;  
   
 procedure update_language_mapping(  
   p_application_id       in number,  
   p_language          in varchar2,  
   p_new_trans_application_id  in number)  
 is  
 begin  
   apex_190100.htmldb_lang.update_language_mapping(p_application_id,p_language,p_new_trans_application_id);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.update_language_mapping('  
    || 'p_Application_Id=>' || api_trace.Literal(p_Application_Id)  
    || ', p_Language=>' || api_trace.Literal(p_Language)  
    || ', p_New_Trans_Application_Id=>' || api_trace.Literal(p_New_Trans_Application_Id) || ')', p_level=>5);  
 end update_language_mapping;  
   
 procedure delete_language_mapping(  
   p_application_id in number,  
   p_language    in varchar2)  
 is  
 begin  
   apex_190100.htmldb_lang.delete_language_mapping(p_application_id,p_language);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.delete_language_mapping('  
    || 'p_Application_Id=>' || api_trace.Literal(p_Application_Id)  
    || ', p_Language=>' || api_trace.Literal(p_Language) || ')', p_level=>5);  
 end delete_language_mapping;  
   
 procedure publish_application(  
   p_application_id      in number,  
   p_language         in varchar2,  
   p_new_trans_application_id in number default null )  
 is  
 begin  
   apex_190100.htmldb_lang.publish_application(p_application_id,p_language,p_new_trans_application_id);  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.publish_application('  
    || 'p_Application_Id=>' || api_trace.Literal(p_Application_Id)  
    || ', p_Language=>' || api_trace.Literal(p_Language)  
    || ', p_New_Trans_Application_Id=>' || api_trace.Literal(p_New_Trans_Application_Id) || ')', p_level=>5);  
 end publish_application;  
   
 procedure emit_language_selector_list  
 is  
 begin  
   apex_190100.htmldb_lang.emit_language_selector_list;  
     apex_debug.log_long_message(p_message=>'API: ' || 'apex_lang.emit_language_selector_list', p_level=>5);  
 end emit_language_selector_list;  
   
 END APEX_LANG;  
 /  
   


For some APEX APIs you have to grant access to the following objects:

GRANT EXECUTE ON APEX_<version>.WWV_FLOW_SECURITY TO <schema_name>;

GRANT EXECUTE ON APEX_<version>.WWV_FLOW_THEMES TO <schema_name>;

GRANT EXECUTE ON APEX_<version>.WWV_FLOW_THEME_STYLES TO <schema_name>;

Start an APEX application that you want to trace and execute a page with debug level LEVEL5 enabled. The debug log will then contain entries that list the invocations with input and output parameters for the prepared packages.

Example APEX Debug Log with API Calls:


Trace Code for LOCAL packages

Here the app lists generated calls to the api_trace package for your own local packages.

The switches 'Compact' and 'Logging of Procedure Start' control the code generation for your needs.

Copy the listed code into your package to enable tracing of your procedures and functions.

The generated code uses the api_trace.Literal function to convert the passed values to quoted literals.

The function api_trace.Literal is overloaded to support many data types. Data types that cannot be converted, such as records, are logged with a <datatype x> placeholder. The logged values are truncated to a maximum length.

The compact form of the generated code is executed directly by simply passing the list of function arguments.

The number of arguments is checked at runtime to match the argument count of the current calling function.
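
For illustration, a hand-written trace call in one of your own procedures could follow the same pattern as the generated code above (a sketch with a hypothetical procedure; the listed code from the app gives you the exact calls):

 create or replace procedure update_salary (
   p_employee_id in number,
   p_salary      in number )
 is
 begin
   update oehr_employees
   set salary = p_salary
   where employee_id = p_employee_id;
   -- log the call with quoted argument values at debug level 5
   apex_debug.log_long_message(
     p_message => 'API: update_salary('
       || 'p_Employee_Id=>' || api_trace.Literal(p_Employee_Id)
       || ', p_Salary=>'    || api_trace.Literal(p_Salary) || ')',
     p_level   => 5);
 end update_salary;
 /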

Publish Application Objects in other Schema

Here you can produce a mirror copy of a schema.
The target schema should be a new, empty schema that was created on the APEX Admin / Workspace to Schema Assignments page.
In the copy, tables and views are represented as views that access the original tables and views.
Other objects like packages, functions, procedures, and types are represented as synonyms.
An application that can run in the original schema will be able to run in the copied schema too.
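
For example, a mirrored table and a mirrored package might be exposed in the target schema roughly like this (illustrative DDL only, with hypothetical schema and object names):

 -- table EMPLOYEES of the source schema appears as a view in the mirror schema
 create or replace view mirror_schema.employees as
 select * from source_schema.employees;

 -- package EMPLOYEE_API of the source schema is reachable through a synonym
 create or replace synonym mirror_schema.employee_api for source_schema.employee_api;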

The application can be downloaded from the following GitHub repository:

https://github.com/dstrack/APEX_API_Tracer

A demo of the app can be found here:

https://yboieokpjit8ejk-strackdev02.adb.eu-frankfurt-1.oraclecloudapps.com/ords/f?p=104:9999

Name: demo Password: Simple/8021


Samstag, 19. Juni 2021

An APEX Plugin for download of Interactive Reports as ZIP files

 APEX processing plugin for downloading large APEX Interactive Reports as ZIP files.

If you have to download very large reports from an APEX instance that is running behind a web proxy server, or if you have to download reports with more rows than the technical limit allows, you can use this plugin to download these reports successfully. File sizes of up to 2 GB are supported for the compressed data. The sample app can be installed in APEX 5 or higher.

Usage:

1. Install the plugin file process_type_plugin_com_strack-software_ir_zip_download.sql in your application.

2. Add a button to your report region.

3. Set the button action to redirect and set the link to the current page with Request = 'ZIP_DOWNLOAD'.



4. In the Rendering tab, at Pre-Rendering / Before Header, add a process of type 'Plug-in' and choose the 'IR Zip Download' plugin.

    Set the plugin setting Region Name to the title of the IR region.

    Set Condition / Type to: Request = Value, and Condition / Value to: ZIP_DOWNLOAD



The plugin performs the following steps:

1. Extract the query with APEX_IR.GET_REPORT.

2. Bind the variables.

3. Open a cursor for the query.

4. Convert the rows to CSV.

5. Add the CSV to a ZIP archive with APEX_ZIP.ADD_FILE.

6. Start the download to the browser (a condensed sketch of these steps follows below).
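
For illustration, a condensed sketch of these steps might look as follows (assuming a single Interactive Report region on the page, no CSV escaping, and no error handling; the plugin's actual implementation is more complete):

 declare
   l_region_id number;
   l_report    apex_ir.t_report;
   l_cursor    integer;
   l_col_cnt   integer;
   l_desc      dbms_sql.desc_tab2;
   l_value     varchar2(4000);
   l_line      varchar2(32767);
   l_status    integer;
   l_csv       blob;
   l_zip       blob;
   l_raw       raw(32767);
 begin
   -- 1. find the IR region of the current page and extract its query and bind variables
   select region_id into l_region_id
   from apex_application_page_regions
   where application_id = v('APP_ID') and page_id = v('APP_PAGE_ID')
   and source_type = 'Interactive Report';
   l_report := apex_ir.get_report(p_page_id => v('APP_PAGE_ID'), p_region_id => l_region_id);
   -- 2./3. bind the variables and open a cursor for the query
   l_cursor := dbms_sql.open_cursor;
   dbms_sql.parse(l_cursor, l_report.sql_query, dbms_sql.native);
   for i in 1 .. l_report.binds.count loop
     dbms_sql.bind_variable(l_cursor, l_report.binds(i).name, l_report.binds(i).value);
   end loop;
   dbms_sql.describe_columns2(l_cursor, l_col_cnt, l_desc);
   for i in 1 .. l_col_cnt loop
     dbms_sql.define_column(l_cursor, i, l_value, 4000);
   end loop;
   l_status := dbms_sql.execute(l_cursor);
   -- 4. fetch the rows and convert them to CSV lines
   dbms_lob.createtemporary(l_csv, true);
   while dbms_sql.fetch_rows(l_cursor) > 0 loop
     l_line := null;
     for i in 1 .. l_col_cnt loop
       dbms_sql.column_value(l_cursor, i, l_value);
       l_line := l_line || case when i > 1 then ',' end || l_value;
     end loop;
     l_raw := utl_raw.cast_to_raw(l_line || chr(10));
     dbms_lob.writeappend(l_csv, utl_raw.length(l_raw), l_raw);
   end loop;
   dbms_sql.close_cursor(l_cursor);
   -- 5. add the CSV file to a ZIP archive
   dbms_lob.createtemporary(l_zip, true);
   apex_zip.add_file(p_zipped_blob => l_zip, p_file_name => 'report.csv', p_content => l_csv);
   apex_zip.finish(p_zipped_blob => l_zip);
   -- 6. push the ZIP file to the browser and stop further page rendering
   sys.owa_util.mime_header('application/zip', false);
   sys.htp.p('Content-Disposition: attachment; filename="report.zip"');
   sys.htp.p('Content-Length: ' || dbms_lob.getlength(l_zip));
   sys.owa_util.http_header_close;
   sys.wpg_docload.download_file(l_zip);
   apex_application.stop_apex_engine;
 end;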


The plugin can be downloaded from Github:

A demo app is available here:

Sonntag, 30. Mai 2021

A printer friendly template for the APEX Universal Theme

Oracle APEX enables you to optimize a page for printing: Optimizing a Page for Printing

Theme 25 (Blue Responsive) contains a page template PRINTER_FRIENDLY for printing an APEX page as HTML with the user's web browser in a printer-friendly mode. In Theme 42 (Universal Theme) this template is missing. An attempt to print, for example, a large classic report page with Firefox via the browser's print menu produced only the first page, followed by empty pages. The responsive design CSS rules seem to cause this problem.

The printer-friendly output of multi-page reports in the APEX Universal Theme can be reactivated with a custom page template, a menu entry, and printing via a jQuery plugin. These are the steps to implement the printer-friendly mode in an application with the Universal Theme.

1. Install jQuery.print.js as an Application file

Download the file jQuery.print.js and upload it into your Shared Components / Static Application Files.


After uploading the following file should be listed.



2. Prepare the page template

Go to Shared Components / Templates and filter the list with Theme = 42, Type = Page. Then copy the template named 'Minimal (No Navigation)' by clicking the copy icon of that row. Name the new template 'Printer Friendly' and click Copy to save it.


3. Edit the new page template

Set the Template Class: Printer Friendly

In the entry for Body, paste the following HTML code. Above the #BODY# substitution tag, two buttons, Back and Print, have been inserted.
 <div class="t-Body">  
  <div class="t-Body-main">  
    <div class="t-Body-title" id="t_Body_title">  
     #REGION_POSITION_01#  
    </div>  
    <div class="t-Body-content" id="t_Body_content">  
     <div class="t-Body-contentInner" role="main">  
     <button class="t-Button t-Button--noLabel t-Button--icon" onclick="history.back();" type="button" id="GO_BACK" title="Go Back" aria-label="Go Back">  
      <span class="t-Icon fa fa-arrow-left" aria-hidden="true">  
      </span>  
     </button>  
     <button class="t-Button t-Button--noLabel t-Button--icon" onclick="$.print(print_region);" type="button" id="PRINT_REPORT" title="Print Report" aria-label="Print Report">  
      <span class="t-Icon fa fa-print" aria-hidden="true">  
      </span>  
     </button>  
      #BODY#  
     </div>  
     <footer class="t-Footer" role="contentinfo">  
      <div class="t-Footer-body">  
       <div class="t-Footer-content">#REGION_POSITION_05#</div>  
      </div>  
     </footer>  
    </div>  
  </div>  
 </div>  
 <div class="t-Body-inlineDialogs">  
  #REGION_POSITION_04#  
 </div>  
At runtime, the buttons Back and Print will be displayed at the top of the page.



In the section 'Javascript' set
- Files URLs: 
#APP_IMAGES#jQuery.print.js

- Functions and Global Variable Declaration:
var print_region = 'div.t-Body-contentInner';

- Execute when Page Loads:
apex.theme42.initializePage.noSideCol();
$(document).ready(function() {
    window.setTimeout(function() {
        $.print(print_region);
    }, 100);
});


In the section 'Cascading Style Sheet' set 
- inline:
@media print {
  button#PRINT_REPORT,button#GO_BACK {
   display: none;
  }
 }

4. Edit the Theme Component defaults

In Shared Components click on Themes. In the Themes list click on the Name 'Universal Theme - 42'. Now set the Printer Friendly Page to  'Printer Friendly' and Save.



5. Prepare the menu entry

In Shared Components click on Navigation Bar List. In the Lists Report click on the Name 'Navigation Bar'. Now click on 'Create Entry'. Set the following items:

Image/Class: fa-print

List Entry Label: Print

Target type: Page in this Application

Page : &APP_PAGE_ID.

reset pagination for this page: enabled

Printer Friendly: enabled

Condition Type: User is Authenticated (not public) – to avoid display on the login page



After saving the Menu Entry the following line should be shown in the List Details:



When you run your application you will see a Print Icon in the upper-right of your web page. 

When you click this button the printer-friendly page will be loaded. The page will open the browser's printer dialog popup window. Here you can further optimize the output and redirect the output to a pdf file.

You can specify a print region for each page by overriding the JavaScript variable print_region. For example, set the Static ID of a page region to TABLE_DATA_VIEW and then add the following line to the 'Function and Global Variable Declaration':

var print_region = $('div#TABLE_DATA_VIEW');


The template and its functionality are demonstrated in the Schema & Data Browser APEX application that has been released on 

Freitag, 11. Dezember 2020

Schema & Data Browser - Part 4

A new Beta Release Version 1.9.10 of the Schema & Data Browser APEX application has been released on https://github.com/dstrack/Schema_Data_Browser


Many of the concepts of the data browser that were described in the previous blog post are now implemented and can be demonstrated.

Data Browser Improvements:

  • Improved query generation for import/export views and for history, counters, links, totals, and folder path LOVs.
  • Better joins descriptions for Import / Export forms in the Actions / Columns modal dialog.
  • Improved navigation in the Actions / Show Query modal dialog. The view mode can be chosen in a tab-pane.
  • Improved trigger generation for updatable views. Added a key lookup for composite keys with a file path.
  • Faster installation and refresh of materialized views.
  • Improved page load performance for home page and data browser.
  • Improved progress bar display for scheduler jobs.
  • Improved query generation for LOVs. Up to 4 levels of recursion are now resolved to build hierarchical and path descriptions.
  • Manage Schema Tasks: The processes to add primary and natural keys have been improved. New buttons enable you to generate and download updatable views and UI-defaults.
  • Improved refresh of report regions.
  • Improved Dialog for Sorting and Grouping
  • Early validation of entered data against the schema constraints and size limits via AJAX calls is now enabled.
  • On the home page, the list of My Databases is now displayed when the application is using APEX authentication. This list informs you about the accessible schemas and their space usage, and it enables you to switch the active schema by clicking on an entry. Multiple schemas have to be assigned to the APEX workspace, and the supporting objects of the application have to be installed in those schemas to make them accessible for this application.

Sources:

  • The PL/SQL packages weco_mail, weco_auth_mgr, and table app_preferences have been removed.
  • The Pre-Installation-Validations for the privileges to execute dbms_lock, dbms_tcp, and dbms_smtp have been removed.
  • In the custom authorization scheme, the package apex_mail is used instead of weco_mail.
  • All of the PL/SQL and JavaScript source files that are installed by the application are listed in the sources directory of the Git repository.
  • Removed references to apex_ajax; apex.server.process and the plugin apex-plugin-clob-load are used instead.
  • The demo time period is now 2 months.

Setting up a new Schema
This application enables you and other users with an APEX developer account in your workspace to add new schemas at runtime. I believe that it is good practice to create at least one new schema for each independent application. If you have watched the video from the fabulous Mr. Connor McDonald, 'From Doors Wide Open ... to By Invitation Only!', you may come to the same conclusion.
Besides the advantages of modularization and separation of concerns, you can get rid of prefixes for your table names and can use namespaces instead. For example, the table name oehr_Employees becomes oehr.Employees.
The following video demonstrates how to add a new (database) schema named 'HR Data'. The user has to enter a password for the database user (which can be used in SQL Developer) and a second password for the current user for the schema admin account (which is stored as a hash value in the table app_users in that schema).


The video demonstrates the installation of a sample dataset and the setting of table prefixes to improve the displayed names. When that is done, we click on the entry 'HR Data' in the 'My Databases' list to activate the database as the current one. A background job will be started that refreshes the table and column system statistics and builds a fast-access cache of the data dictionary system views in a set of materialized views. 
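
Such a cache can be sketched as a materialized view over the data dictionary (an illustrative example with a hypothetical name; the app builds a whole set of these):

 CREATE MATERIALIZED VIEW MV_BROWSER_TAB_COLUMNS
 BUILD IMMEDIATE
 REFRESH COMPLETE ON DEMAND
 AS
 SELECT T.TABLE_NAME, C.COLUMN_NAME, C.COLUMN_ID,
     C.DATA_TYPE, C.DATA_LENGTH, C.DATA_PRECISION, C.DATA_SCALE, C.NULLABLE
 FROM USER_TABLES T
 JOIN USER_TAB_COLUMNS C ON C.TABLE_NAME = T.TABLE_NAME;

 -- the background job can refresh the cache after DDL changes:
 -- DBMS_MVIEW.REFRESH('MV_BROWSER_TAB_COLUMNS', 'C');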


The next video is about the Manage Schema Tasks: define recursive natural keys to produce detailed record descriptions, and define mandatory key columns to improve the key lookup in the triggers of updatable views. Then see an example updatable view Voehr_Employees_Imp with a trigger that handles the DML operations. 


 CREATE OR REPLACE VIEW VOEHR_EMPLOYEES_IMP  
   ( LINK_ID$, FIRST_NAME, LAST_NAME, EMAIL, PHONE_NUMBER, HIRE_DATE, JOB_JOB_ID, SALARY, COMMISSION_PCT, MANAGER_EMAIL, DEPARTMENT_DEPARTMENT_NAME, DEPARTMENT_LOC_STREET_ADDRESS, DEPARTMENT_LOCATI_POSTAL_CODE, DEPARTMENT_LOCATION_CITY, DEPARTMENT_LOC_STATE_PROVINCE, DEPARTMENT_LOCAT_COUNTRY_NAME, DEPARTMENT_LOCATI_REGION_NAME  
   , CONSTRAINT VOEHR_EMPLOYEES_IMP_PK PRIMARY KEY (LINK_ID$) RELY DISABLE )   
  AS   
 SELECT    
   A.EMPLOYEE_ID LINK_ID$,  
   A.FIRST_NAME,  
   A.LAST_NAME,  
   A.EMAIL,  
   A.PHONE_NUMBER,  
   A.HIRE_DATE,  
   B.JOB_ID JOB_JOB_ID,  
   A.SALARY,  
   A.COMMISSION_PCT,  
   C.EMAIL MANAGER_EMAIL,  
   D.DEPARTMENT_NAME DEPARTMENT_DEPARTMENT_NAME,  
   D_C.STREET_ADDRESS DEPARTMENT_LOC_STREET_ADDRESS,  
   D_C.POSTAL_CODE DEPARTMENT_LOCATI_POSTAL_CODE,  
   D_C.CITY DEPARTMENT_LOCATION_CITY,  
   D_C.STATE_PROVINCE DEPARTMENT_LOC_STATE_PROVINCE,  
   D_C_B.COUNTRY_NAME DEPARTMENT_LOCAT_COUNTRY_NAME,  
   D_C_B_B.REGION_NAME DEPARTMENT_LOCATI_REGION_NAME  
 FROM OEHR_EMPLOYEES A  
 JOIN OEHR_JOBS B ON B.JOB_ID = A.JOB_ID  
 LEFT OUTER JOIN OEHR_EMPLOYEES C ON C.EMPLOYEE_ID = A.MANAGER_ID  
 LEFT OUTER JOIN OEHR_DEPARTMENTS D ON D.DEPARTMENT_ID = A.DEPARTMENT_ID  
 LEFT OUTER JOIN OEHR_LOCATIONS D_C ON D_C.LOCATION_ID = D.LOCATION_ID  
 LEFT OUTER JOIN OEHR_COUNTRIES D_C_B ON D_C_B.COUNTRY_ID = D_C.COUNTRY_ID  
 LEFT OUTER JOIN OEHR_REGIONS D_C_B_B ON D_C_B_B.REGION_ID = D_C_B.REGION_ID;  
   
 CREATE OR REPLACE TRIGGER VOEHR_EMPLOYEES_IMP_TR INSTEAD OF INSERT OR UPDATE OR DELETE ON VOEHR_EMPLOYEES_IMP FOR EACH ROW   
 DECLARE   
   v_row OEHR_EMPLOYEES%ROWTYPE;  
   v_DEPARTMENT_LOCATION_ID OEHR_LOCATIONS.LOCATION_ID%TYPE;  
   v_DEPARTMENT_LOCATION_REGION_ID OEHR_REGIONS.REGION_ID%TYPE;  
   v_DEPARTMENT_LOCATIO_COUNTRY_ID OEHR_COUNTRIES.COUNTRY_ID%TYPE;  
 BEGIN  
   if DELETING then   
     DELETE FROM OEHR_EMPLOYEES A   
     WHERE A.EMPLOYEE_ID = :new.LINK_ID$;  
     return;  
   end if;  
   v_row.EMPLOYEE_ID           := :new.LINK_ID$;  
   v_row.FIRST_NAME            := :new.FIRST_NAME;  
   v_row.LAST_NAME            := :new.LAST_NAME;  
   v_row.EMAIL              := :new.EMAIL;  
   v_row.PHONE_NUMBER           := :new.PHONE_NUMBER;  
   v_row.HIRE_DATE            := :new.HIRE_DATE;  
   v_row.SALARY              := :new.SALARY;  
   v_row.COMMISSION_PCT          := :new.COMMISSION_PCT;  
   if :new.DEPARTMENT_LOCATI_REGION_NAME IS NOT NULL then   
    begin  
     SELECT D_C_B_B.REGION_ID INTO v_DEPARTMENT_LOCATION_REGION_ID  
     FROM OEHR_REGIONS D_C_B_B   
     WHERE D_C_B_B.REGION_NAME = :new.DEPARTMENT_LOCATI_REGION_NAME;  
    exception when NO_DATA_FOUND then  
     INSERT INTO OEHR_REGIONS(REGION_NAME)  
     VALUES (:new.DEPARTMENT_LOCATI_REGION_NAME)  
     RETURNING (REGION_ID) INTO v_DEPARTMENT_LOCATION_REGION_ID;  
    end;  
   end if;  
   if :new.MANAGER_EMAIL IS NOT NULL then   
     SELECT C.EMPLOYEE_ID INTO v_row.MANAGER_ID  
     FROM OEHR_EMPLOYEES C   
     WHERE C.EMAIL = :new.MANAGER_EMAIL;  
   end if;  
   if :new.JOB_JOB_ID IS NOT NULL then   
     SELECT B.JOB_ID INTO v_row.JOB_ID  
     FROM OEHR_JOBS B   
     WHERE B.JOB_ID = :new.JOB_JOB_ID;  
   end if;  
   if :new.DEPARTMENT_LOCAT_COUNTRY_NAME IS NOT NULL  
   or v_DEPARTMENT_LOCATION_REGION_ID IS NOT NULL then   
     SELECT D_C_B.COUNTRY_ID INTO v_DEPARTMENT_LOCATIO_COUNTRY_ID  
     FROM OEHR_COUNTRIES D_C_B   
     WHERE (D_C_B.COUNTRY_NAME = :new.DEPARTMENT_LOCAT_COUNTRY_NAME OR D_C_B.COUNTRY_NAME IS NULL AND :new.DEPARTMENT_LOCAT_COUNTRY_NAME IS NULL)  
     AND (D_C_B.REGION_ID = v_DEPARTMENT_LOCATION_REGION_ID OR D_C_B.REGION_ID IS NULL AND v_DEPARTMENT_LOCATION_REGION_ID IS NULL);  
   end if;  
   if :new.DEPARTMENT_LOC_STREET_ADDRESS IS NOT NULL  
   or :new.DEPARTMENT_LOCATI_POSTAL_CODE IS NOT NULL  
   or :new.DEPARTMENT_LOCATION_CITY IS NOT NULL  
   or :new.DEPARTMENT_LOC_STATE_PROVINCE IS NOT NULL  
   or v_DEPARTMENT_LOCATIO_COUNTRY_ID IS NOT NULL then   
    begin  
     SELECT D_C.LOCATION_ID INTO v_DEPARTMENT_LOCATION_ID  
     FROM OEHR_LOCATIONS D_C   
     WHERE (D_C.STREET_ADDRESS = :new.DEPARTMENT_LOC_STREET_ADDRESS OR D_C.STREET_ADDRESS IS NULL AND :new.DEPARTMENT_LOC_STREET_ADDRESS IS NULL)  
     AND (D_C.POSTAL_CODE = :new.DEPARTMENT_LOCATI_POSTAL_CODE OR D_C.POSTAL_CODE IS NULL AND :new.DEPARTMENT_LOCATI_POSTAL_CODE IS NULL)  
     AND (D_C.CITY = :new.DEPARTMENT_LOCATION_CITY OR :new.DEPARTMENT_LOCATION_CITY IS NULL)  
     AND (D_C.STATE_PROVINCE = :new.DEPARTMENT_LOC_STATE_PROVINCE OR D_C.STATE_PROVINCE IS NULL AND :new.DEPARTMENT_LOC_STATE_PROVINCE IS NULL)  
     AND (D_C.COUNTRY_ID = v_DEPARTMENT_LOCATIO_COUNTRY_ID OR D_C.COUNTRY_ID IS NULL AND v_DEPARTMENT_LOCATIO_COUNTRY_ID IS NULL);  
    exception when NO_DATA_FOUND then  
     INSERT INTO OEHR_LOCATIONS(STREET_ADDRESS, POSTAL_CODE, CITY, STATE_PROVINCE, COUNTRY_ID)  
     VALUES (:new.DEPARTMENT_LOC_STREET_ADDRESS, :new.DEPARTMENT_LOCATI_POSTAL_CODE, :new.DEPARTMENT_LOCATION_CITY, :new.DEPARTMENT_LOC_STATE_PROVINCE, v_DEPARTMENT_LOCATIO_COUNTRY_ID)  
     RETURNING (LOCATION_ID) INTO v_DEPARTMENT_LOCATION_ID;  
    end;  
   end if;  
   if :new.DEPARTMENT_DEPARTMENT_NAME IS NOT NULL  
   or v_DEPARTMENT_LOCATION_ID IS NOT NULL then   
    begin  
     SELECT D.DEPARTMENT_ID INTO v_row.DEPARTMENT_ID  
     FROM OEHR_DEPARTMENTS D   
     WHERE (D.DEPARTMENT_NAME = :new.DEPARTMENT_DEPARTMENT_NAME OR :new.DEPARTMENT_DEPARTMENT_NAME IS NULL)  
     AND (D.LOCATION_ID = v_DEPARTMENT_LOCATION_ID OR D.LOCATION_ID IS NULL AND v_DEPARTMENT_LOCATION_ID IS NULL);  
    exception when NO_DATA_FOUND then  
     INSERT INTO OEHR_DEPARTMENTS(DEPARTMENT_NAME, LOCATION_ID)  
     VALUES (:new.DEPARTMENT_DEPARTMENT_NAME, v_DEPARTMENT_LOCATION_ID)  
     RETURNING (DEPARTMENT_ID) INTO v_row.DEPARTMENT_ID;  
    end;  
   end if;  
   
   if INSERTING then   
     INSERT INTO OEHR_EMPLOYEES VALUES v_row;  
   else   
     UPDATE OEHR_EMPLOYEES SET   
       FIRST_NAME = v_row.FIRST_NAME,  
       LAST_NAME = v_row.LAST_NAME,  
       EMAIL = v_row.EMAIL,  
       PHONE_NUMBER = v_row.PHONE_NUMBER,  
       HIRE_DATE = v_row.HIRE_DATE,  
       JOB_ID = v_row.JOB_ID,  
       SALARY = v_row.SALARY,  
       COMMISSION_PCT = v_row.COMMISSION_PCT,  
       MANAGER_ID = v_row.MANAGER_ID,  
       DEPARTMENT_ID = v_row.DEPARTMENT_ID  
     WHERE EMPLOYEE_ID = :new.LINK_ID$;  
   end if;  
 END VOEHR_EMPLOYEES_IMP_TR;  
 /

When the columns of the natural keys are defined as mandatory (NOT NULL), then the lookup operations require fewer OR conditions to handle NULL cases.
   

CREATE OR REPLACE VIEW VOEHR_EMPLOYEES_IMP  
   ( LINK_ID$, FIRST_NAME, LAST_NAME, EMAIL, PHONE_NUMBER, HIRE_DATE, JOB_JOB_ID, SALARY, COMMISSION_PCT, MANAGER_EMAIL, DEPARTMENT_DEPARTMENT_NAME, DEPARTMENT_LOC_STREET_ADDRESS, DEPARTMENT_LOCATI_POSTAL_CODE, DEPARTMENT_LOCATION_CITY, DEPARTMENT_LOC_STATE_PROVINCE, DEPARTMENT_LOCAT_COUNTRY_NAME, DEPARTMENT_LOCATI_REGION_NAME  
   , CONSTRAINT VOEHR_EMPLOYEES_IMP_PK PRIMARY KEY (LINK_ID$) RELY DISABLE )   
  AS   
 SELECT    
   A.EMPLOYEE_ID LINK_ID$,  
   A.FIRST_NAME,  
   A.LAST_NAME,  
   A.EMAIL,  
   A.PHONE_NUMBER,  
   A.HIRE_DATE,  
   B.JOB_ID JOB_JOB_ID,  
   A.SALARY,  
   A.COMMISSION_PCT,  
   C.EMAIL MANAGER_EMAIL,  
   D.DEPARTMENT_NAME DEPARTMENT_DEPARTMENT_NAME,  
   D_C.STREET_ADDRESS DEPARTMENT_LOC_STREET_ADDRESS,  
   D_C.POSTAL_CODE DEPARTMENT_LOCATI_POSTAL_CODE,  
   D_C.CITY DEPARTMENT_LOCATION_CITY,  
   D_C.STATE_PROVINCE DEPARTMENT_LOC_STATE_PROVINCE,  
   D_C_B.COUNTRY_NAME DEPARTMENT_LOCAT_COUNTRY_NAME,  
   D_C_B_B.REGION_NAME DEPARTMENT_LOCATI_REGION_NAME  
 FROM OEHR_EMPLOYEES A  
 JOIN OEHR_JOBS B ON B.JOB_ID = A.JOB_ID  
 LEFT OUTER JOIN OEHR_EMPLOYEES C ON C.EMPLOYEE_ID = A.MANAGER_ID  
 LEFT OUTER JOIN OEHR_DEPARTMENTS D ON D.DEPARTMENT_ID = A.DEPARTMENT_ID  
 LEFT OUTER JOIN OEHR_LOCATIONS D_C ON D_C.LOCATION_ID = D.LOCATION_ID  
 LEFT OUTER JOIN OEHR_COUNTRIES D_C_B ON D_C_B.COUNTRY_ID = D_C.COUNTRY_ID  
 LEFT OUTER JOIN OEHR_REGIONS D_C_B_B ON D_C_B_B.REGION_ID = D_C_B.REGION_ID;  
   
 CREATE OR REPLACE TRIGGER VOEHR_EMPLOYEES_IMP_TR INSTEAD OF INSERT OR UPDATE OR DELETE ON VOEHR_EMPLOYEES_IMP FOR EACH ROW   
 DECLARE   
   v_row OEHR_EMPLOYEES%ROWTYPE;  
   v_DEPARTMENT_LOCATION_ID OEHR_LOCATIONS.LOCATION_ID%TYPE;  
   v_DEPARTMENT_LOCATION_REGION_ID OEHR_REGIONS.REGION_ID%TYPE;  
   v_DEPARTMENT_LOCATIO_COUNTRY_ID OEHR_COUNTRIES.COUNTRY_ID%TYPE;  
 BEGIN  
   if DELETING then   
     DELETE FROM OEHR_EMPLOYEES A   
     WHERE A.EMPLOYEE_ID = :new.LINK_ID$;  
     return;  
   end if;  
   v_row.EMPLOYEE_ID           := :new.LINK_ID$;  
   v_row.FIRST_NAME            := :new.FIRST_NAME;  
   v_row.LAST_NAME            := :new.LAST_NAME;  
   v_row.EMAIL              := :new.EMAIL;  
   v_row.PHONE_NUMBER           := :new.PHONE_NUMBER;  
   v_row.HIRE_DATE            := :new.HIRE_DATE;  
   v_row.SALARY              := :new.SALARY;  
   v_row.COMMISSION_PCT          := :new.COMMISSION_PCT;  
   if :new.DEPARTMENT_LOCATI_REGION_NAME IS NOT NULL then   
    begin  
     SELECT D_C_B_B.REGION_ID INTO v_DEPARTMENT_LOCATION_REGION_ID  
     FROM OEHR_REGIONS D_C_B_B   
     WHERE D_C_B_B.REGION_NAME = :new.DEPARTMENT_LOCATI_REGION_NAME;  
    exception when NO_DATA_FOUND then  
     INSERT INTO OEHR_REGIONS(REGION_NAME)  
     VALUES (:new.DEPARTMENT_LOCATI_REGION_NAME)  
     RETURNING (REGION_ID) INTO v_DEPARTMENT_LOCATION_REGION_ID;  
    end;  
   end if;  
   if :new.MANAGER_EMAIL IS NOT NULL then   
     SELECT C.EMPLOYEE_ID INTO v_row.MANAGER_ID  
     FROM OEHR_EMPLOYEES C   
     WHERE C.EMAIL = :new.MANAGER_EMAIL;  
   end if;  
   if :new.JOB_JOB_ID IS NOT NULL then   
     SELECT B.JOB_ID INTO v_row.JOB_ID  
     FROM OEHR_JOBS B   
     WHERE B.JOB_ID = :new.JOB_JOB_ID;  
   end if;  
   if :new.DEPARTMENT_LOCAT_COUNTRY_NAME IS NOT NULL  
   and v_DEPARTMENT_LOCATION_REGION_ID IS NOT NULL then   
     SELECT D_C_B.COUNTRY_ID INTO v_DEPARTMENT_LOCATIO_COUNTRY_ID  
     FROM OEHR_COUNTRIES D_C_B   
     WHERE D_C_B.COUNTRY_NAME = :new.DEPARTMENT_LOCAT_COUNTRY_NAME  
     AND D_C_B.REGION_ID = v_DEPARTMENT_LOCATION_REGION_ID;  
   end if;  
   if :new.DEPARTMENT_LOC_STREET_ADDRESS IS NOT NULL  
   or :new.DEPARTMENT_LOCATI_POSTAL_CODE IS NOT NULL  
   or :new.DEPARTMENT_LOCATION_CITY IS NOT NULL  
   or :new.DEPARTMENT_LOC_STATE_PROVINCE IS NOT NULL  
   or v_DEPARTMENT_LOCATIO_COUNTRY_ID IS NOT NULL then   
    begin  
     SELECT D_C.LOCATION_ID INTO v_DEPARTMENT_LOCATION_ID  
     FROM OEHR_LOCATIONS D_C   
     WHERE (D_C.STREET_ADDRESS = :new.DEPARTMENT_LOC_STREET_ADDRESS OR :new.DEPARTMENT_LOC_STREET_ADDRESS IS NULL)  
     AND (D_C.POSTAL_CODE = :new.DEPARTMENT_LOCATI_POSTAL_CODE OR D_C.POSTAL_CODE IS NULL AND :new.DEPARTMENT_LOCATI_POSTAL_CODE IS NULL)  
     AND (D_C.CITY = :new.DEPARTMENT_LOCATION_CITY OR :new.DEPARTMENT_LOCATION_CITY IS NULL)  
     AND (D_C.STATE_PROVINCE = :new.DEPARTMENT_LOC_STATE_PROVINCE OR D_C.STATE_PROVINCE IS NULL AND :new.DEPARTMENT_LOC_STATE_PROVINCE IS NULL)  
     AND (D_C.COUNTRY_ID = v_DEPARTMENT_LOCATIO_COUNTRY_ID OR v_DEPARTMENT_LOCATIO_COUNTRY_ID IS NULL);  
    exception when NO_DATA_FOUND then  
     INSERT INTO OEHR_LOCATIONS(STREET_ADDRESS, POSTAL_CODE, CITY, STATE_PROVINCE, COUNTRY_ID)  
     VALUES (:new.DEPARTMENT_LOC_STREET_ADDRESS, :new.DEPARTMENT_LOCATI_POSTAL_CODE, :new.DEPARTMENT_LOCATION_CITY, :new.DEPARTMENT_LOC_STATE_PROVINCE, v_DEPARTMENT_LOCATIO_COUNTRY_ID)  
     RETURNING (LOCATION_ID) INTO v_DEPARTMENT_LOCATION_ID;  
    end;  
   end if;  
   if :new.DEPARTMENT_DEPARTMENT_NAME IS NOT NULL  
   and v_DEPARTMENT_LOCATION_ID IS NOT NULL then   
    begin  
     SELECT D.DEPARTMENT_ID INTO v_row.DEPARTMENT_ID  
     FROM OEHR_DEPARTMENTS D   
     WHERE D.DEPARTMENT_NAME = :new.DEPARTMENT_DEPARTMENT_NAME  
     AND D.LOCATION_ID = v_DEPARTMENT_LOCATION_ID;  
    exception when NO_DATA_FOUND then  
     INSERT INTO OEHR_DEPARTMENTS(DEPARTMENT_NAME, LOCATION_ID)  
     VALUES (:new.DEPARTMENT_DEPARTMENT_NAME, v_DEPARTMENT_LOCATION_ID)  
     RETURNING (DEPARTMENT_ID) INTO v_row.DEPARTMENT_ID;  
    end;  
   end if;  
   
   if INSERTING then   
     INSERT INTO OEHR_EMPLOYEES VALUES v_row;  
   else   
     UPDATE OEHR_EMPLOYEES SET   
       FIRST_NAME = v_row.FIRST_NAME,  
       LAST_NAME = v_row.LAST_NAME,  
       EMAIL = v_row.EMAIL,  
       PHONE_NUMBER = v_row.PHONE_NUMBER,  
       HIRE_DATE = v_row.HIRE_DATE,  
       JOB_ID = v_row.JOB_ID,  
       SALARY = v_row.SALARY,  
       COMMISSION_PCT = v_row.COMMISSION_PCT,  
       MANAGER_ID = v_row.MANAGER_ID,  
       DEPARTMENT_ID = v_row.DEPARTMENT_ID  
     WHERE EMPLOYEE_ID = :new.LINK_ID$;  
   end if;  
 END VOEHR_EMPLOYEES_IMP_TR;  
 /  

In case you want to use this kind of named view in your application, you can press the button 'Generate Updatable Views' to install a set of views named V<Table_Name>_IMP as a logical projection layer over your tables. In the next step, you can click the 'Export UI Defaults' button to export detailed properties of the GUI fields for new forms on those tables or views. You then have to import the downloaded file hr_data_uidefaults.sql in your workspace. The demo app EMPS demonstrates how the view Voehr_Employees_Imp can be used in a Faceted Search and edited in an Interactive Grid, an Interactive Report, and a Data Entry Form. When you use this view in a data loading process, you can not only load data into the table Oehr_Employees, but also load new entries into Oehr_Regions, Oehr_Locations, and Oehr_Departments. Not a single line of SQL code had to be written in this demo app.

Add Calculations: Add a calculated column Netto_Price and define summands to produce totals (Order Items - ... Netto Price, and Employee - ... Salary) in grouped reports of referenced tables.


Add audit columns to tables whose names start with OEHR. The columns Created_At, Created_By, Last_Modified_At, and Last_Modified_By are added to the tables. The values are managed by DEFAULT ON NULL function calls in the column definitions and by BEFORE UPDATE triggers.
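
For a single table, the generated DDL follows roughly this pattern (an illustrative sketch with simplified defaults; the code generated by the app differs in detail):

 ALTER TABLE OEHR_EMPLOYEES ADD (
     CREATED_AT       TIMESTAMP    DEFAULT ON NULL LOCALTIMESTAMP,
     CREATED_BY       VARCHAR2(64) DEFAULT ON NULL NVL(SYS_CONTEXT('APEX$SESSION','APP_USER'), USER),
     LAST_MODIFIED_AT TIMESTAMP    DEFAULT ON NULL LOCALTIMESTAMP,
     LAST_MODIFIED_BY VARCHAR2(64) DEFAULT ON NULL NVL(SYS_CONTEXT('APEX$SESSION','APP_USER'), USER)
 );

 CREATE OR REPLACE TRIGGER OEHR_EMPLOYEES_AUDIT_TR
 BEFORE UPDATE ON OEHR_EMPLOYEES FOR EACH ROW
 BEGIN
     -- refresh the audit columns on every update
     :new.LAST_MODIFIED_AT := LOCALTIMESTAMP;
     :new.LAST_MODIFIED_BY := NVL(SYS_CONTEXT('APEX$SESSION','APP_USER'), USER);
 END;
 /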

The DDL statements that were performed to alter your schema are stored in the table app_protocol and can be exported with this app.

I hope you find some inspiration and a use case for this app. I welcome contributions and feedback on GitHub, such as stars and issues. I became interested in implementing a practical kind of updatable views as explained by E. F. Codd in the book 'The Relational Model for Database Management Version 2', page 299, chapter 17.3, View-updatability Algorithms VU-1 and VU-2. 
When a database application has fast access to the data dictionary constraint definitions, it can provide much more automatic functionality than just expanding a table name to a report or form on the raw data structures.

Regards
Dirk Strack

Sonntag, 27. September 2020

List of values (LOVs) for your APEX app from CHECK (COL IN (...)) constraints

Some time ago a customer asked me to build an APEX app with a special feature: the CHECK constraints of the application's SQL tables may be altered in the future, and the APEX form pages should require no changes in that case. One of the problems I had to solve was to use dynamic LOV definitions instead of static LOV definitions, since static ones would become outdated. For columns with CHECK (COL IN (...)) constraints, the view V_LOV_FROM_CHECK_IN_LIST produces dynamic LOVs with values from the constraint definition. (You must be connected to at least Oracle Database 18c to use the column SEARCH_CONDITION_VC and APEX 5.1 to use the required apex_string.split function.)

 CREATE OR REPLACE VIEW V_LOV_FROM_CHECK_IN_LIST (Table_Name, Column_Name, Display_Value, Column_Value)   
 AS   
  SELECT Table_Name, Column_Name,  
   INITCAP(Column_Value) Display_Value,  
   Column_Value  
 FROM (SELECT Table_Name, Column_Name,   
     REGEXP_REPLACE(TRIM(Column_Value),                      -- remove quotes  
          '^''(.*)''$', '\1'  
     ) Column_Value  
   FROM ( -- convert values to rows  
     SELECT C.Table_Name, B.Column_Name,   
                REGEXP_REPLACE( C.SEARCH_CONDITION_VC,   
                     B.Column_Name || '\s+IN\s*\((.+)\)\s*$',   
                     '\1', 1, 1, 'i'  
       ) CHECK_IN_LIST                                    -- extract values list  
     FROM SYS.USER_CONSTRAINTS C  
     JOIN SYS.USER_CONS_COLUMNS B   
      ON C.Constraint_Name = B.Constraint_Name AND C.Table_Name = B.Table_Name  
     WHERE C.CONSTRAINT_TYPE = 'C'                         -- check constraint  
     AND C.Table_Name NOT LIKE 'BIN$%'                     -- this table is not in the recyclebin  
           AND REGEXP_INSTR( C.SEARCH_CONDITION_VC,           -- find CHECK (COL IN (...)) rules  
                B.Column_Name || '\s+IN\s*\(.+\)\s*$',   
       1, 1, 1, 'i'  
     ) > 0   
   ) S,  
   TABLE( apex_string.split(S.CHECK_IN_LIST, ',')) P     -- extract individual (quoted) values  
 );  
 COMMENT ON TABLE V_LOV_FROM_CHECK_IN_LIST IS 'List of values from simple CHECK (COL IN (...)) constraint for each table column.';  

Using the following example table:

 CREATE TABLE TEST_TABLE (   
      ID NUMBER CONSTRAINT TEST_TABLE_PK PRIMARY KEY,   
      ZOOM_FACTOR FLOAT DEFAULT 1 NOT NULL CONSTRAINT TEST_TABLE_ZOOM_FACTOR_CK CHECK ( ZOOM_FACTOR IN (1, 2, 4, 8) ),  
      EXCLUDE_SINGLES VARCHAR2(5) DEFAULT 'NO' NOT NULL CONSTRAINT TEST_TABLE_EXCL_SINGLE_CK CHECK ( Exclude_Singles IN ('YES','NO') ),  
      EDGE_LABELS VARCHAR2(5) DEFAULT 'YES' NOT NULL      CONSTRAINT TEST_TABLE_EDGE_LABELS_CK CHECK ( Edge_Labels IN ('YES','NO','BOXES') ),  
      EXCITE_METHOD VARCHAR2(50) DEFAULT 'none' NOT NULL CONSTRAINT TEST_TABLE_EXCITE_METHOD_CK   
           CHECK (EXCITE_METHOD IN ('none', 'selected', 'downstream', 'upstream', 'connected'))  
 );  

Now you can replace the static LOV for column EXCITE_METHOD for example with a dynamic LOV that is getting the values from your table definitions. You can do this replacement for all table-columns with CHECK (col IN (...)) constraints that are listed by the view.

 SELECT Display_Value d, Column_Value r  
 FROM V_LOV_FROM_CHECK_IN_LIST  
 WHERE Table_Name = 'TEST_TABLE' AND Column_Name = 'EXCITE_METHOD';  
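
With the example table above, this query should return rows like the following (illustrative output; the display value is simply the INITCAP of the stored value):

 D           R
 ----------  ----------
 None        none
 Selected    selected
 Downstream  downstream
 Upstream    upstream
 Connected   connected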

Old static LOV definition
New dynamic LOV definition