Channel: Andrej Baranovskij Blog

ADF BC Locking Lifetime and Application Module Pooling

The ADF BC API provides a method to lock the current row, but the lock lifetime can be short. The lock method issues a SQL lock and locks the record for the current user. If your use case is to keep the lock for a longer period of time - in other words, to reserve the record for the user until commit or rollback - this is not the method you should use. A lock issued from ADF BC can be released automatically, before the user commits his changes. This happens when DB pooling is enabled or when there are more concurrent users than the AM pool can handle. If you want to make sure the record stays reserved by the user for a certain period of time, you must implement a custom flag column in the DB and update it separately.

In this post I will demo why you should avoid using the ADF BC lock method to reserve the current row for the user. Here you can download the sample application - LockApp.zip.

The VO implements a custom method exposed to the Data Control, where the ADF BC API lock() method is called for the current row - this issues a SQL lock in the DB for that row:
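As a rough illustration only (the class and method names below are hypothetical, not copied from the sample), such a method could look like this:

import oracle.jbo.Row;
import oracle.jbo.server.ViewObjectImpl;

// Hypothetical VO implementation class - the method is exposed through the VO client interface.
public class EmployeesViewImpl extends ViewObjectImpl {

    // Locks the current row in the DB (ADF BC issues a SELECT ... FOR UPDATE style lock).
    public void lockCurrentRow() {
        Row current = getCurrentRow();
        if (current != null) {
            current.lock();
        }
    }
}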


User A opens the ADF form, selects the record with ID = 102 and invokes our method to lock the record:


We can see the SQL statement executed in the DB - the lock is set successfully:


User B opens the ADF form, selects the same record with ID = 102 and invokes the lock method - an error is generated, since user A still holds the lock:


We can see this from the SQL log - the lock attempt was not successful:


The lock remains for user A only in a perfect scenario. In real life, when more users access your system, at some point AM pooling will kick in and AM instances will be switched between users. When the AM instance is switched, the DB connection is reset and the current lock is lost.

In order to simulate losing the lock, I set Referenced Pool Size = 1. This means the AM will support only 1 user; if there are more users, AM pooling will start working and the AM instance will be shared between users:


User B can set a lock now, even before user A releases his lock. This means user A loses the reserved row before doing a commit/rollback:


ADF Essentials 12c ADF BC 'BO_SP' ORA-01086 Error

It seems there are differences when running the same ADF application on WebLogic and on the GlassFish runtime. These differences are minor, but could cause some headache for developers. In this particular case, I would like to describe the ORA-01086: savepoint 'BO_SP' never established error. Typically this error happens if the PS_TXN table is unavailable. However, this is not the case with ADF Essentials 12c - the PS_TXN table exists and the application is able to access it. The 'BO_SP' error is generated when a DB constraint is triggered: instead of returning the DB error message, ADF Essentials 12c displays the ORA-01086: savepoint 'BO_SP' never established error.

How to reproduce: set the Salary attribute value to -1 - there is a constraint in the DB to check for positive values. The constraint is invoked, but instead of getting the constraint message, the 'BO_SP' error happens:


The workaround is to enable DB pooling for the Application Module, by setting the Disconnect Application Module Upon Release property:


This is the recommended setting for ADF BC performance and Data Source connection usage optimization: Stress Testing Oracle ADF BC Applications - Do Connection Pooling and TXN Disconnect Level. It is hard to say why exactly enabling DB pooling fixes the 'BO_SP' error for ADF Essentials 12c; I guess it is related to the DB driver used in GlassFish. With DB pooling enabled, the DB connection is released after each request - perhaps this avoids rolling back the transaction to BO_SP and in turn fixes the error.

The DB constraint message is displayed correctly with DB pooling enabled:


Download the sample application - ListViewAppEssentials.zip.

ADF 12c - Target Tag to Enhance PPR Rendering

There is a new tag in ADF 12c called target. In some use cases this tag can effectively substitute PPR dependencies and help to avoid unwanted validation errors in the data entry process. You can read more about this tag in the ADF Faces 12c documentation - 8.3 Using the Target Tag to Execute PPR.

I will demo a use case with a LOV and a dependent mandatory input text field. The value selected from the LOV should update the mandatory text field. With an ADF 11g style PPR dependency you get a validation error for the required field before the LOV is even loaded - since the LOV tries to refresh the dependent input text on load, not only on value return.

Here you can see a typical PPR dependency between the LOV and the input field. The LOV is set with auto submit and there is a partial trigger specified for the input text:


We insert new row to make all fields to be blank:


Try to open LOV, validation error for dependent mandatory field will be displayed:


Let's use target tag as per documentation referenced in above link. There is no need to set auto submit and partial trigger dependencies. Only define target tag to be applied for LOV and render changes in text field:


LOV opens without triggering validation errors in required field:


Selected value is returned from LOV and set for dependent text field automatically:


Placeholder - another new feature in ADF Faces 12c. We can provide help text describing empty field:


This is just another property for the input text component:


Download sample application - TargetRefreshApp.zip.

ADF Task Flow Template Improvements in 12c

There are great improvements in ADF task flow templates in the 12c release. We can create an ADF task flow template based on another template, and what is even more amazing - the JDeveloper 12c ADF task flow diagram window displays the template contents when editing the actual consuming task flow. There is an option to substitute an activity in the consuming ADF task flow for the generic activity from the template. I'm going to explain how you can do this.

Here is the sample application for an ADF task flow template in 12c - TaskFlowTemplateApp.zip. The application implements the following ADF task flow template:


As you can see, the template contains a generic router activity - it checks how the task flow should be loaded and whether a new row should be inserted. There are method calls for the Commit and Rollback operations, meaning there is no need to add Commit and Rollback operations into the Page Definition - they will be called directly from the task flow. The fragmentView activity is defined, but not created; we are going to substitute it with the real fragment a bit later in the consuming task flow.

There are three page definitions created for the activities in the template - create, commit and rollback operations:


We can create our task flow based on the available ADF task flow template:


Here you can see a new 12c feature - the task flow editor displays the contents from the template. This is an amazing and very useful feature:


All the activities can be reused from the template. Only the page fragment is substituted with the real one, meaning every consuming task flow can have its own fragment. In order to substitute the fragment, you need to go to the source code mode and override the activity ID from the template (fragmentView in this example), pointing it to the fragment you want to load in this particular ADF task flow:


Fragment can be created in the same way as in ADF 11g:


Page is loaded and user can commit/rollback through generic activity methods implemented in the ADF task flow template:

Red Samurai Performance Audit Tool - OOW 2013 release (v 1.1)

We have been running our Red Samurai Performance Audit tool and monitoring ADF performance in various projects for about a year and a half already. It helps us a lot to understand ADF performance bottlenecks and to tune slow ADF BC View Objects or optimise large ADF BC fetches from the DB.

There is a special update implemented for OOW'13 - advanced ADF BC statistics are collected directly from your application's ADF BC runtime and later displayed graphically in the dashboard. I will be attending OOW'13 in San Francisco - feel free to stop me and ask about this tool, I will be happy to give it away and explain how to use it in your project.

The original audit screen with ADF BC performance issues - this is part of our Audit console application:


Audit console v1.1 is improved with one more tab - Statistics. This tab displays all SQL select statements produced by ADF BC over time, logged-in users, AM access load distribution and the number of AM activations along with user sessions.

Available graphs:

1. Daily Queries  - total number of SQL selects per day

2. Hourly Queries - Last 48 Hours

3. Logged Users - total number of user sessions per day

4. SQL Selects per Application Module - workload per Application Module

5. Number of Activations and User sessions - last 48 hours - displays stress load


6. Daily Transactions - insert, update and delete statements per day

7. Hourly Transactions - Last 48 Hours


See you at OOW'13 !

Understanding ADF Task Flow Page Flow Scope Lifetime

I would say it's a bit confusing to understand how long objects stored in Page Flow Scope really reside in memory. This is the reason I implemented a small sample application and did a test. Summary of the results (see detailed description below):

1. Every ADF task flow instance gets its own Page Flow Scope

2. Page Flow Scope is not destroyed when you navigate away from the task flow

3. You can get back to a previously left Page Flow Scope only by using a Task Flow Return activity

This allows me to presume that it is not really good to have many small ADF task flows in the system, as theoretically there will be a lot of wasted Page Flow Scope entries, especially when navigating without returning back. On the contrary, when using larger ADF task flows and fewer of them, there will be less Page Flow Scope memory wasted.

The sample application is available for download - PageFlowScopeApp.zip. It implements two ADF task flows. Task flow A contains one fragment, a task flow B call and a return activity:


Task flow B contains one fragment and a return activity. There is one trick here - task flow B contains a call back to the calling task flow A. This is done on purpose, to simulate a second instance of task flow A:


Task flow A defines an input parameter - backRendered. This is needed to distinguish whether the task flow instance is opened from task flow B or not:


The task flow A call from task flow B sets the parameter value:


Based on this parameter value, we can render components conditionally in the task flow A fragment. For example, the Back button is hidden if the parameter is not set (which means the task flow A instance was not created through a task flow call from B):


Below are the steps for the test. Task flow A is loaded with its fragment; here the user types some text and stores it in Page Flow Scope:
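For reference, a value can be stored in and read from Page Flow Scope programmatically roughly like this (a minimal sketch with an assumed attribute name; the sample may simply bind the input field to #{pageFlowScope.someText} instead):

import java.util.Map;
import oracle.adf.share.ADFContext;

// Illustrative helper - each task flow instance sees only its own Page Flow Scope map.
public class PageFlowScopeHelper {

    public void saveText(String value) {
        Map<String, Object> scope = ADFContext.getCurrent().getPageFlowScope();
        scope.put("someText", value);
    }

    public String readText() {
        return (String) ADFContext.getCurrent().getPageFlowScope().get("someText");
    }
}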


Press the Navigate button to open an instance of task flow B:


A Page Flow Scope for task flow B is created now. You are in the task flow B Page Flow Scope - store some text here as well:


Press Next to open a second instance of task flow A:


In the second instance of task flow A, type different text - not the same you were typing in the first instance initially:


You should notice the Back button rendered - this is because of the passed parameter. Now that different text has been set, what would you expect to see when coming back to the first instance of the same task flow? As every task flow instance maintains its own Page Flow Scope, you should expect to see the original text. Press the Back button - this navigates back to the calling task flow B:


Task flow B still displays the previously set text - this means that when you navigate away from a task flow, its Page Flow Scope is not lost and remains in memory. You should keep this in mind when designing ADF task flows and implementing your forms - make sure you don't store garbage in Page Flow Scope, as this may waste memory, especially when having many ADF task flows and navigating between them without returning. Page Flow Scope from task flow B keeps its value; press the Back button to navigate to the first instance of task flow A:


Task flow A keeps its original Page Flow Scope - the text value is preserved and displayed:


This means we can return back through a task flow return activity and access the previously initialised Page Flow Scope - it is not gone.

If you press the Navigate button in task flow A and go to task flow B, a new instance of task flow B is created with a new Page Flow Scope (but it seems the previous instance remains in memory as well):


If you open task flow A through the task flow call from task flow B, a new instance of task flow A is created and a new Page Flow Scope is initialised:


Navigate back to task flow B and from task flow B to the original task flow A instance:


You will see that the Page Flow Scope of the original first task flow A instance is still available:

Dynamic Task Flow Template Actions in ADF 12c/11g

This is an update to my previous post - ADF Task Flow Template Improvements in 12c, describing how to use dynamic actions in an ADF task flow template. Similar dynamic actions can be applied in ADF 11g - this is not limited to 12c. Dynamic actions allow you to build a completely reusable ADF task flow template and use it for common use cases, without implementing the same actions again and again.

The updated sample application - TaskFlowTemplateApp_v2.zip - is improved with new parameters defined for the ADF task flow template. There are two new parameters added - dataControlName and iteratorName; these parameters allow passing the current Data Control name and iterator name to the task flow dynamically:


Create, Commit and Rollback actions are linked with Page Definition files; we can change the underlying ADF bindings to be resolved dynamically based on the ADF task flow template parameters:


Action binding for CreateInsert is updated to use dynamic Binds and DataControl properties from parameters:


Action binding for Commit is updated to use dynamic DataControl name property:


Action binding for Rollback is updated to use dynamic DataControl name property:


The region must be configured to pass the Data Control and iterator names, so the ADF task flow template is able to provide dynamic actions. In a real use case scenario these names can be loaded from the DB along with the menu structure:


Sample application is configured to open form in Create mode initially, dynamic action works perfectly:


User can press Undo and navigate to Edit mode - change one of the fields:


Save changes to the DB with dynamic action from ADF task flow template:

ADF Query Saved Search in ADF 12c

ADF Query Saved Search functionality has existed since ADF 11g - Persisting Query Criteria Results Across Sessions with Oracle MDS. The same works in ADF 12c, and more - the UI behaviour is improved. There are nicer and cleaner dialogs prompting the user to save the search after changes are applied. You need to remember a few tricks to enable Saved Search functionality; I will list all of them in this post.

First comes a quite obvious step - enable user customizations across sessions using MDS in the ADF View configuration. With this setting, the SavedSearchApp.zip application is enabled to store user personalizations in the MDS repository (it can be the file system or the DB):


It is important to enable ADF Security, as user personalizations must be stored for every user under the user name context. The MDS engine retrieves the user name from ADF Security automatically. The screen to enable ADF Security:


One of the last bits - open adf-config.xml and in MDS section specify UserCC class from ADF library:


Specifically for ADF Query Saved Search functionality, make sure to add persistence config into MDS configuration section:


I noticed that when running an MDS-enabled application for the first time in a 12c environment, it may fail with the error: metadata store mapped to the namespace / BASE DEFAULT is read only. This can be fixed by adding the MDS store class into the config. For example, if you are using a file store, define the FileMetadataStore class as per this example:


A different class should be used when the MDS repository is created in the DB: oracle.mds.persistence.stores.db.DBMetadataStore.

Here is how it looks at runtime. Let's assume you go to Advanced mode and add a new query field (Email in this example). On the next action, ADF Query asks you once whether this change needs to be saved:


If you decide to save the ADF Query changes, there is a nice dialog to provide the customization name, set it to be activated by default and executed automatically without pressing the Search button again:


Saved search personalization is available in the list and can be selected any time:


Personalization is saved and available after logout/login - it renders added field and remembers entered criteria value:


Saved search personalization can be easily removed through ADF Query dialog:


This functionality is very important for users, as it allows them to perform searches faster without entering parameters again and again. It also improves application performance, as the search is executed with parameters from the start, limiting the results list.

Enabling UI Shell 12c/11g Multitasking Behavior

The main goal of this post is to describe how to enable multitasking functionality in the out of the box UI Shell template for ADF applications. Different UI Shell tabs can run different transactions and allow the user to commit/rollback data in the scope of an individual tab. Every UI Shell tab runs a different ADF task flow loaded from the menu.

Here is the sample application, enabled with a multitasking UI Shell - TFActivationApp.zip. This application is built with a single AM, serving three VO's - Departments, Employees and Jobs. There are separate ADF task flows for each of these data objects, loaded in UI Shell tabs. The AM:


There is single data control defined:


Perhaps you wonder how this could work in a multitasking environment and run multiple transactions, as there is a single AM. This works with ADF isolated task flows: for each ADF isolated task flow, a new AM instance is created at runtime automatically. As there are three ADF isolated task flows, one per VO, there will be three (at least three) AM instances created at runtime - one AM instance per ADF isolated task flow:


This is the key part to enable multitasking in UI Shell - make sure to use ADF isolated task flows loaded from the menu.

First, let's run the ADF task flows in UI Shell with the default - shared - mode. Two tabs are opened; modify Department Name in the first task flow:


Go to the second tab and modify First Name:


Press Save in the same tab - Employees. Check the log and you will see that data from both tabs (ADF shared task flows) was committed:


This is not what we want - ideally only data from the current tab should be committed. To achieve that, set Isolated mode for the ADF task flows (if using an ADF task flow template, this can be done centrally):


With isolated mode enabled, when you open new UI Shell tab and load ADF task flow:


A new AM instance is created automatically, meaning a new transaction is assigned to the ADF task flow loaded in the UI Shell tab. We can observe the new AM instance creation from the log:


With isolated mode turned on, changes made in UI Shell tabs are committed only for the currently active tab:


Even if there are changes made in multiple tabs, only changes from the currently active tab are committed. We can observe this from the log - commit is executed for the currently active tab's AM instance only:


To prove this, we can open the second UI Shell tab, where changes were made previously - press Cancel and the changes are reverted:


Multitasking is working, but we need to test another aspect - validation handling. Ideally the user should be able to navigate between UI Shell tabs without being blocked by validation errors. When committing data in the active tab, there should be no validation errors displayed from other UI Shell tabs.

Go to the first UI Shell tab - Departments and press Create button to insert new record:


Without typing any data, try to go to the second UI Shell tab - validation errors are reported and the user is blocked in the current tab:


In order to resolve this and allow navigation to a different tab, we need to set the Immediate = true property for the tab item. This is UI Shell - you could get the source code and modify it, but this would require changing the UI Shell library in your project. There is another way - use MDS Seeded Customization to modify the out of the box UI Shell; this is what I'm using in this sample application.

Enable MDS Seeded Customization for the project in ADF View section:


Don't forget to define SiteCC class in adf-config.xml, this class handles site level customizations global for all users:


To perform actual MDS Seeded Customization, switch JDeveloper to the Customization mode:


Once JDeveloper restarts in Customization mode, select the Show Libraries option for your application - this will list all the attached libraries:


You can browse through the library contents, in this example - Oracle Extended Page Templates (it contains UI Shell). Open the template page - dynamicTabShell - and set Immediate = true for the tab item; this allows switching between tabs, ignoring validation errors in the current UI Shell tab:


JDeveloper generates a new file - the MDS Seeded Customization file; this is where the customization is stored and applied later at runtime:


Let's repeat the same test - insert new record in Departments tab:


We can navigate to a different UI Shell tab, but validation errors from the first tab are rendered when trying to submit data in the current tab - not good:


This can be fixed with a hint from my previous blog post - Skip Validation for ADF Required Tabs. You should set SkipValidation = true in the main page definition, the one which implements the UI Shell page:


It works now - data from second tab can be saved successfully, independently from validation state in the first tab:


Go to the first tab - you can see the validation messages are still there. These messages are displayed in the scope of the currently opened tab only:


One more good thing I want to share with you - closing ADF task flows in UI Shell. When a tab is closed in UI Shell, the ADF task flow is destroyed automatically and the finalizer is invoked, where you could release custom resources used by the ADF task flow. Closing a tab in UI Shell:


Finalizer is invoked - we can see this from the log:


Finalizer must be defined in the ADF task flow properties:


The finalizer bean is defined in the same ADF task flow, in request scope:
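A finalizer is just a parameterless method on a task flow managed bean; a minimal sketch (the bean and method names are illustrative) could look like this:

import oracle.adf.share.logging.ADFLogger;

// Registered in the task flow as a request scope managed bean and referenced
// from the task flow Finalizer property, e.g. #{tabFinalizerBean.onTaskFlowFinalize}.
public class TabFinalizerBean {

    private static final ADFLogger LOGGER = ADFLogger.createADFLogger(TabFinalizerBean.class);

    // Invoked automatically when the task flow instance is destroyed (for example, when the tab is closed).
    public void onTaskFlowFinalize() {
        LOGGER.info("Task flow finalizer invoked - releasing custom resources");
        // release any custom resources held by this task flow instance here
    }
}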


Conditional Task Flow Activation in ADF 12c/11g

I will talk about tabs and ADF regions. Depending on the use case, you may want to ensure that only one region is ever loaded - the one which is displayed. If the user opens another tab, the region from the previously open tab should be destroyed. We can achieve such functionality with a combination of conditional region activation and ADF task flow isolated scope. Every time a different tab is opened, ADF will open the referenced region as if for the first time - data will not be cached.

Feel free to download the sample application this post is based on - TabsTFActivation.zip. The key part is to enable conditional activation for every region included in the tabs - see the example based on EL:


Every tab must have a property listener defined to set the tab identifier; the same identifier is used from the Page Definition EL to conditionally activate the currently displayed region:
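The idea can be sketched with a simple bean holding the selected tab identifier (the names and EL below are illustrative assumptions, not the exact sample code): each tab writes into it via af:setPropertyListener, and the region activation EL in the Page Definition compares against it.

import java.io.Serializable;

// Illustrative bean (for example in viewScope) storing which tab is currently selected.
// A tab could set it with af:setPropertyListener to="#{viewScope.tabStateBean.selectedTab}",
// and a region activation condition could check #{viewScope.tabStateBean.selectedTab eq 'employees'}.
public class TabStateBean implements Serializable {

    private String selectedTab = "departments";

    public String getSelectedTab() {
        return selectedTab;
    }

    public void setSelectedTab(String selectedTab) {
        this.selectedTab = selectedTab;
    }
}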


The region will be deactivated successfully only if Isolated data control scope is set. Make sure Isolated scope is set for every region:


Custom resources of the deactivated region should be cleaned up through the finalizer; the Page Flow Scope is cleaned up along with the Data Control entry and the AM instance automatically:


Let's open one of the tabs from the sample for the first time - Employees:


Employees data is fetched from DB initially:


Go to the first tab - Departments and then open Employees tab again:


You will notice Employees data is fetched again, as the region was created from scratch after being destroyed before:


This approach could work as an optimization for some use cases, when you don't want to hold data for a long time while the user is not working with the region in an inactive tab.

Extending WebCenter Portal 11.1.1.8 Made Easy

If you had a chance to extend WebCenter Spaces prior to the 11.1.1.8 Portal release, you would perhaps agree - it was not that easy. Good news - with the latest WebCenter Portal 11.1.1.8 release, the process of extending WebCenter Portal became simpler and more stable. There is a special JDeveloper application template provided, designed to extend WebCenter Portal with custom ADF Task Flows and Java code. You can read more about it in the WebCenter 11.1.1.8 Developer guide here. Check how complicated it was to extend WebCenter Spaces before - What Else Can Go Wrong when Extending WebCenter Spaces.

The application template to extend WebCenter Portal is named WebCenter Portal Server Extension:


Based on this template, you can create a new JDeveloper application with two projects included - PortalExtension and PortalSharedLibrary. The PortalExtension project can include any custom code you deploy. However, most of the time you will attach your custom ADF Libraries to the PortalSharedLibrary project. You simply need to add your custom library name into the weblogic.xml file:


The PortalSharedLibrary project deploys the extend.spaces.webapp shared library to the WebCenter Portal server. WebCenter Portal itself references the extend.spaces.webapp shared library. This library acts as a proxy - it is empty, there is just a weblogic.xml file with a list of our custom libraries to be referenced. You should deploy the extend.spaces.webapp library to the WebCenter server as a shared library (don't forget to restart the server afterwards):


Make sure to increase extend.spaces.webapp library version, before deploying - it must be greater than currently deployed one:


Your custom library should be deployed separately on the server; this library is referenced from the proxy extend.spaces.webapp library's weblogic.xml. I deployed the custom library as a shared library, using the sample application from my previous blog post - Deploying ADF Applications as Shared Libraries on WLS:


New version of extend.spaces.webapp library is deployed, referencing redsamurai.shared.lib:


To test the newly added custom ADF Task Flow, you need to make sure the ADF Security permission is set first, otherwise the ADF Task Flow will not be visible. Log in to EM and grant the security permission; in this example the ADF Task Flow for employees is granted to the authenticated role:


Once the security permission is assigned, the custom ADF Task Flow for employees becomes visible from the library catalog and can be added to the WebCenter Portal resource catalog:


It can be consumed in the WebCenter Portal page editor (formerly - Composer):


Employees data is rendered inside WebCenter Portal through custom ADF Task Flow:


Here you can download the webcenter_extend_ps7.zip example with the WebCenter Portal Server Extension application, including the custom library reference. The shared library deployment application is packaged into the sample as well; it is taken from my previous blog post mentioned above.

Major Release for Red Samurai Performance Audit Tool v 2.0

A new major release of our Red Samurai Performance Audit Tool was published earlier this week. The current update contains many new graphs, providing lots of insightful statistics on ADF BC performance and user access.

We are using this tool in every project we run; it really helps to optimize ADF BC and overall ADF application performance. A little bit about the history of this tool. The first release was developed in August 2012 - Red Samurai Performance Audit Tool - Runtime Diagnosis for ADF Applications. Update v 1.1 was published this year, just before OOW - Red Samurai Performance Audit Tool - OOW 2013 release (v 1.1). Update v 1.1 included such statistical information as overall transactions and queries going through ADF BC, as well as user access.

One of the highlights of the current v 2.0 update is the use of ADF gauge components to visually display each AM's performance. We display slow query performance in seconds as minimum, average and maximum slow query processing times:


There is visual drill down functionality implemented, providing the option to select an AM name and review problematic VO instances ordered by issue occurrences or by processing time. This helps to understand directly which VO is slow and saves time in fixing it:

Integrating Custom BPM Worklist into WebCenter Portal (Same Domain for BPM and WebCenter)

I would like to share a sample application configured to run a custom BPM Worklist, along with the steps describing how to configure and access it from WebCenter Portal. This post is based on two other posts from my blog, and I would recommend going through them first. One describes how to extend WebCenter Portal 11.1.1.8 - Extending WebCenter Portal 11.1.1.8 Made Easy. The other one is about deploying custom ADF shared libraries - Deploying ADF Applications as Shared Libraries on WLS. For this post, I assume the BPM and WebCenter environment is running on the same domain. Ah, and there is one more - a custom BPM Worklist access implementation through the BPM Java API - ADF 11g PS5 Application with Customized BPM Worklist Task Flow (MDS Seeded Customization).

BPM and WebCenter Portal run on the same domain - each on a different WebLogic Managed Server. With such a configuration, the setup is quite straightforward compared to having separate domains:


As there is one domain, it is relatively easy to link WebCenter to BPM. You only need to define a Foreign JNDI provider; it allows accessing the Worklist context from the WebCenter Portal environment. Make sure BPM libraries are targeted to the WebCenter Portal server.

The sample application - BPMWebCenterExtendApp.zip - contains a script to create the required Foreign JNDI provider for the WebCenter managed server. As BPM and WebCenter run on the same domain, make sure to disable the other properties of the script, except createJNDI:


The script may fail or produce errors, but most likely the JNDI link will be created successfully anyway. Go to the WebLogic console and verify it - you should see a similar entry under Foreign JNDI providers:


All required links for this Foreign JNDI provider should be created by the script:


The script may fail to set the proper target for the created Foreign JNDI provider - you can do this manually. Make sure the target points to the WebCenter managed server:


The application where the ADF task flow with the custom BPM Worklist is implemented imports several required BPM related libraries. You can see the list here:


The ADF task flow fetches the list of tasks from the BPM engine - tasks assigned to the current user. In the next step, it displays the assigned tasks in a table:


The sample application provides a method to access the BPM Worklist context and fetch tasks assigned to the user:
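In broad strokes, querying the tasks assigned to a user with the Human Workflow Java API looks roughly like the sketch below (the user credentials, display columns and the exact queryTasks arguments are assumptions to verify against the workflow services API docs, not the sample's code):

import java.util.Arrays;
import java.util.List;

import oracle.bpel.services.workflow.client.IWorkflowServiceClient;
import oracle.bpel.services.workflow.client.WorkflowServiceClientFactory;
import oracle.bpel.services.workflow.query.ITaskQueryService;
import oracle.bpel.services.workflow.task.model.Task;
import oracle.bpel.services.workflow.verification.IWorkflowContext;

public class WorklistTaskFetcher {

    // Fetches tasks assigned directly to the given user (credentials here are placeholders).
    public List<Task> fetchAssignedTasks(String userName, String password) throws Exception {
        IWorkflowServiceClient client =
            WorkflowServiceClientFactory.getWorkflowServiceClient(WorkflowServiceClientFactory.REMOTE_CLIENT);
        ITaskQueryService queryService = client.getTaskQueryService();
        IWorkflowContext ctx = queryService.authenticate(userName, password.toCharArray(), null);

        List<String> displayColumns = Arrays.asList("TASKID", "TITLE", "STATE", "PRIORITY");
        // MY filter - only tasks assigned directly to the authenticated user, no paging.
        return queryService.queryTasks(ctx, displayColumns, null,
                                       ITaskQueryService.AssignmentFilter.MY,
                                       null, null, null, 0, 0);
    }
}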


The ADF task flow with custom BPM Worklist access is deployed as an ADF Library; this allows integrating it into WebCenter Portal through the extension mechanism:


The ADF Library is deployed with the provided SharedLib application, deployed as the redsamurai.shared.lib library:


This shared library references the required BPM libraries, plus our ADF Library with the custom ADF task flow for the BPM Worklist:


Deploy shared library to WebCenter managed server:


Once shared library is deployed, restart WebCenter managed server and you should be able to add custom BPM Worklist ADF task flow to WebCenter Resource Catalog:


Through WebCenter page editor, we can add our custom ADF task flow from the catalog:


Task flow displays list of tasks assigned in BPM for current portal user:


We can open standard BPM Workspace environment and see same tasks available there:


This example displays the list of tasks retrieved from BPM for the current user. Using the BPM API we could create a new task, process a task, or map a task instance to a UI form. This is the beauty of ADF - you are free to implement your own lightweight components to interact with BPM and consume them in WebCenter Portal.

Evil Behind ChangeEventPolicy PPR in CRUD ADF 12c and WebLogic Stuck Threads

With this post I'm starting to prepare for our UKOUG'13 conference sessions. You can attend two of our sessions on UKOUG'13 Super Sunday, December 1st. These sessions are scheduled to run immediately one after another, so we are going to have two straight hours to discuss topics around ADF - ADF Anti-Patterns: Dangerous Tutorials and ADF Development Survival Kit. You should stop by to say hi - my colleague Florin Marcus and I will be happy to answer any technical question about ADF.

Today's topic will be covered in the first session - ADF Anti-Patterns: Dangerous Tutorials. If you know how to implement CRUD functionality, you might be surprised - there are more things to know. I will describe one issue specifically related to ADF 12c; in the next post I will present a scenario reproduced in ADF 11g.

In your production application, you may experience WebLogic Stuck Threads. This is usually related to large fetches; in most cases it happens unexpectedly and is not easily reproduced. In ADF 11g this is related to AM passivation/activation behaviour (to be described in more detail in the next post); in ADF 12c I found another reason for an unexpected large fetch in CRUD - the ChangeEventPolicy = PPR setting on the ADF iterator in the Page Definition.

Here you can download the fixed CRUD application for ADF 12c - LargeFetchApp.zip. This application provides two methods - one to insert 10000 rows into the Regions table and one to remove the same rows. This is important to reproduce the error - it is easily reproduced when there are more records in the DB. Run the populateTable method to insert 10000 rows into the Regions table:
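A populateTable style method in the AM implementation class could be sketched roughly like this (the VO instance name, attribute names and values are illustrative, not the exact sample code):

import oracle.jbo.Row;
import oracle.jbo.ViewObject;
import oracle.jbo.domain.Number;
import oracle.jbo.server.ApplicationModuleImpl;

public class AppModuleImpl extends ApplicationModuleImpl {

    // Inserts dummy Regions rows so the large fetch becomes easy to reproduce.
    public void populateTable() {
        ViewObject regions = findViewObject("Regions1");
        for (int i = 100; i < 10100; i++) {
            Row row = regions.createRow();
            row.setAttribute("RegionId", new Number(i));
            row.setAttribute("RegionName", "Region " + i);
            regions.insertRow(row);
        }
        getDBTransaction().commit();
    }
}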


After running populateTable method, around 10000 rows should be available in the DB:


The Regions VO implementation class contains an overridden createRowFromResultSet method. This method tracks every row fetched through the VO and reports it to the log:
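The override could look roughly like this (a sketch - the actual class in the sample may differ in details):

import java.sql.ResultSet;

import oracle.adf.share.logging.ADFLogger;
import oracle.jbo.server.ViewObjectImpl;
import oracle.jbo.server.ViewRowImpl;

public class RegionsViewImpl extends ViewObjectImpl {

    private static final ADFLogger LOGGER = ADFLogger.createADFLogger(RegionsViewImpl.class);

    // Called by ADF BC for every row materialized from the JDBC result set -
    // logging here makes unexpected large fetches visible.
    @Override
    protected ViewRowImpl createRowFromResultSet(Object qc, ResultSet resultSet) {
        ViewRowImpl row = super.createRowFromResultSet(qc, resultSet);
        LOGGER.fine("Fetched row with key: " + row.getKey());
        return row;
    }
}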


Double check Regions iterator properties in Page Definition:


You will see ChangeEventPolicy = ppr is set by default:


Run the test application (make sure ChangeEventPolicy = ppr) and press the Create button - you will experience a 15 to 30 second delay before the blank row shows up:


This is easily reproduced if the table is large enough (for example 10000 rows or more), as generated with the populateTable method. To see the ADF log output for the Regions VO implementation class, make sure to set FINEST level in the ADF logger config:


You will see lots of rows fetched during the CreateInsert operation invocation - around 40000 rows (meaning duplicates, as there are only 10000 rows in the table). This explains why Create is so slow, and this is the default in ADF 12c. It makes that many findByKey method calls:


In a production system, when multiple users are doing the same operation and there is even more data in the table, this generates Stuck Threads on the WebLogic server and finally the application stops - simply because it consumes too much memory to create huge rowsets for all the data fetched from the DB.

You should change ChangeEventPolicy to none; this prevents the unexpected large fetch on CreateInsert:


The new row is inserted instantly:


General recommendation - you should avoid the default ChangeEventPolicy = ppr setting, as it usually creates too many side effects. It seems this functionality is not well tested yet.

Reproducing WebLogic Stuck Threads with ADF CreateInsert Operation and ORDER BY Clause

This is the second post related to WebLogic Stuck Threads. You can read the first one - Evil Behind ChangeEventPolicy PPR in CRUD ADF 12c and WebLogic Stuck Threads. In the previous post, I described how to reproduce a WebLogic Stuck Thread in ADF 12c with ChangeEventPolicy = PPR. As per Steve Muench's follow-up comment, PPR works well if you are using an auto-populated primary key. Today's post is about something different, reproducible across all ADF versions - a WebLogic Stuck Thread in relation to the CreateInsert operation and ORDER BY clause usage for a VO.

Here you can download the test case application - LargeFetchApp_v2.zip. Make sure to set FINEST ADF Logger level for the Regions VO Impl class; this will print the fetched rows:


Make sure to disable AM Pooling; this will simulate a stress test environment - passivation/activation will happen on every request:


Region ID is set with a default value; this is used to make sure a new row always gets its primary key set:


We set an ORDER BY clause for REGION_ID and use the DESC operator. This brings rows with higher numbers first and displays them at the top of the ADF table. Make sure to use the populateTable method (described in the previous post mentioned above) to insert 10 000 rows into the REGIONS table:


Press the Create button - a new row with ID = 7 is created, not inserted into the DB yet:


Use the Save button to finally insert the record into the DB:


This is an important step - the new record is inserted as the last record in the DB table, however the UI displays it in the first position, since ORDER BY DESC is set for this VO.

Try to navigate to another tab or do any other request after the record was saved to the DB:


You will see in the log that 10 000 rows are fetched, until the row with ID = 7 is found:


This large fetch is unexpected and causes WebLogic Stuck Threads when multiple users are using the application. The reason it suddenly fetches so many records is that the new record with ID = 7 is displayed as the first row, but this row is not actually retrieved within the first range size of 25 top rows - it comes much later according to the ORDER BY DESC clause logic.

If we inserted a new record with a key of 10 100, the large fetch would not be reproduced, as this row would qualify to come within the first range size of 25 rows.

We change to ASC order and test it again. Remove newly inserted record with ID = 7 from DB:


ORDER BY clause is updated with ASC order, instead of previous DESC - records with lower ID number will appear first:


Repeat the same steps as before, create new row - ID for the key attribute will be set automatically and equal 7:


Save and insert new row to the DB:


The new row with ID = 7 is displayed on top, as it was just inserted, but it still makes it into the range size of the first 25 rows. We can see it from the log - it is the 5th row fetched for the first range size:


As the new row is located early in the first range size, no large fetch happens - the application continues to run fast:


I will describe in my next post how to implement effective CRUD in ADF, taking large fetches into account. Generally this behaviour should be improved in ADF - instead of fetching the entire collection of rows to locate the needed row, it could do a single fetch by key for a row not located in the first range size.

Reusing and Extending ADF BC Entities from Common Model

This post is about ADF architecture and better application structuring with EO reuse from a common model. I describe how to implement additional requirements on top of the common model in extended ADF BC Entities. The great power of the ADF framework is reusability. You should reuse as much as possible - this simplifies maintenance and future development of your application. I will be talking about ADF BC Entity Object (EO) reuse in this post. I would recommend keeping EOs in a common model project and reusing them across the application. A fair requirement would be to have a slightly different EO for a specific use case - instead of creating a new EO for the same DB table, we could extend the original EO and implement the specific changes. For example, we may have a different set of business rules or different doDML logic.

The sample application developed for this post includes a Common Model library and a Main application - eoreuse.zip. The common model library is based on Employees and Jobs EOs, associations and a generic EO implementation class:


The Employees EO from the common model library implements a business rule for the Hire Date attribute - the date must be today or in the past. The idea is to show that business rules from the common EO are inherited by the extended EO as well:


The generic EO implementation class overrides the doDML method; I do this to show the sequence of doDML calls from the extended EO:
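A sketch of such an override (logging added for illustration; the class name is assumed):

import oracle.adf.share.logging.ADFLogger;
import oracle.jbo.server.EntityImpl;
import oracle.jbo.server.TransactionEvent;

// Generic EO implementation class from the common model.
public class GenericEntityImpl extends EntityImpl {

    private static final ADFLogger LOGGER = ADFLogger.createADFLogger(GenericEntityImpl.class);

    // Logs every DML call so the call sequence from extended EOs is visible in the log.
    @Override
    protected void doDML(int operation, TransactionEvent e) {
        LOGGER.info("doDML invoked from generic EntityImpl, operation: " + operation);
        super.doDML(operation, e);
    }
}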


The doDML method from the Employees EO is overridden in the same way as in the generic EO class:


We switch now to the main application, where the common model library is imported. Employees and Jobs EO's are available in the main application:


Here we can create new EO - extending Employees EO from common model:


In order to be able to change the properties of a specific attribute, we need to override it. The Salary attribute is overridden and I have defined an additional business rule - a salary value check:


As we have extended the main Employees EO, we need to define a discriminator attribute. This is a key part - otherwise the main Employees EO will stop working. As there is no real discriminator attribute in Employees, I'm going to create a new transient attribute with a default value of 0:


The discriminator attribute is included in the extended Employees EO; this attribute is set to be hidden - it will never be displayed on the UI:


I have defined one additional association in the main application - between the extended Employees and Jobs EOs. We can assume this would be required by a specific use case:


Employees VO is created inside main application, calculated discriminator attribute is set to be populated from query expression - always returning 0:


This is how value is returned for discriminator attribute from SQL query:


The same value 0 is set for discriminator in VO based on extended Employees EO, it is calculated from SQL query expression:


Query for this VO is the same as for VO based on main Employees EO:


There are two View Links defined for the Employees VO based on the extended EO. The first View Link reuses the Association from the common model between employee and manager; this works perfectly for the extended EO. The extended EO can reuse an Association defined for the parent EO:


The second View Link uses the Association defined in the main application - the one between the extended Employees EO and Jobs from the common model:


Here you can see the final ADF BC Data Model structure exposed: a Master-Detail relationship between Jobs and Employees from the common model, a Master-Detail relationship between the Employees VO based on the extended EO and Jobs from the common model, plus the employee - manager relationship reusing the Association from the common model:


The sample application provides an ADF UI implementation with two tabs. The first tab displays the Employees VO structure based on the main Employees EO. The second tab displays the Employees VO structure based on the extended Employees EO.

In the first tab - in the Employees list based on selected Job, business rule common model for Hire Date is working:


In the second tab, Hire Date business rule is working:


Plus additional business rule for Salary attribute, from extended EO is working as well:


Same rule is not reproduced for Employees data in VO based on main EO - as expected:


Make sure to enable ADF logger for application classes, we should check the sequence of doDML calls:


In the case of VO based on main Employees EO - firstly doDML from Employees EO is called and then from generic Entity Impl class:


Now, if you don't want doDML from the main Employees EO to be called when calling doDML for the extended Employees EO - make sure to change the class being extended:


As expected - doDML from extended EO is called and then doDML from generic Entity Impl class:

WebLogic Stuck Thread Case - Large Fetch Generated by Get Row ADF BC Method

This post is not about a bug, but rather about a hidden pitfall to avoid. Based on my previous use case for a WebLogic Stuck Thread - Reproducing WebLogic Stuck Threads with ADF CreateInsert Operation and ORDER BY Clause - I will describe one more possible scenario for the same issue. This one is related to ADF BC API misuse; often it is unclear what side effects an innocent-looking method can produce. That method is getRow(key).

Please download the complete sample application if you are interested in reproducing it in your environment - LargeFetchApp_v3.zip.

This sample provides a method to generate dummy data for regions - around 10000 rows. The View Object uses ORDER BY to display records in ascending order:


There is a custom method created in the AM implementation class. This method calls the ADF BC API getRow(key) method. Nothing dangerous at first sight, but here I'm using the key of the last record from the rowset - 10099. You may think this is just a method to get a row by key. True, but before returning the one row by key, it fetches all rows up to that row into memory. It travels through each row one by one, until it gets to the row with the defined key. This may consume a lot of memory, especially if the rowset is large and the row with the defined key is somewhere near the end of the rowset. An example of such a method:
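A sketch of such a method in the AM implementation class (the key value 10099 matches the generated data described above; the VO instance name and other details are illustrative):

import oracle.jbo.Key;
import oracle.jbo.Row;
import oracle.jbo.ViewObject;
import oracle.jbo.domain.Number;
import oracle.jbo.server.ApplicationModuleImpl;

public class AppModuleImpl extends ApplicationModuleImpl {

    // Looks innocent, but getRow(Key) walks the rowset row by row -
    // every row before the matching one is fetched into memory first.
    public void getRowByKey() {
        ViewObject regions = findViewObject("Regions1");
        Row row = regions.getRow(new Key(new Object[] { new Number(10099) }));
        if (row != null) {
            System.out.println("Located row: " + row.getAttribute("RegionId"));
        }
    }
}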


Make sure ADF Logger config is enabled for RegionsViewImpl class:


Press Get Row button to invoke our custom method from above:


You will see - all rows are fetched and loaded into memory, before desired row is located:


If there are concurrent users executing the same operation, sooner or later a Stuck Thread will be created in WebLogic and the application will hang.

By the way, the same applies to the Last button use case in ADF - you should never use the Last button, as the Last operation fetches all rows in between, until it gets to the last row in the rowset.

Find By Key and View Criteria Row Finder Methods vs. Get Row Method in ADF BC

You may have seen my previous blog post - WebLogic Stuck Thread Case - Large Fetch Generated by Get Row ADF BC Method. A blog reader was asking whether it would be a solution to use, instead of the Get Row method, alternative methods - Find By Key or View Criteria. Yes, these two methods perform much better compared to Get Row, and there is no full intermediate row range scan.

I would like to present an updated sample application where these two methods are implemented and tested - LargeFetchApp_v4.zip. The application module implementation class contains two methods - one for Find By Key (intentionally, I'm using a key located at the end of the rowset) and another for a Row Finder based on a View Criteria. If I used the View Criteria directly, the rowset would become filtered; instead I'm using the 12c Row Finder feature, which allows using a View Criteria to search for rows in the current rowset:
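The Find By Key variant can be sketched like this (names are illustrative); unlike getRow(Key), findByKey locates the row by key without scanning through all the intermediate rows. The Row Finder variant is defined declaratively on the VO, as shown below, so it is not sketched here.

import oracle.jbo.Key;
import oracle.jbo.Row;
import oracle.jbo.ViewObject;
import oracle.jbo.domain.Number;
import oracle.jbo.server.ApplicationModuleImpl;

public class AppModuleImpl extends ApplicationModuleImpl {

    // Finds a single row by primary key - at most one row is fetched.
    public void findRowByKey() {
        ViewObject regions = findViewObject("Regions1");
        Row[] found = regions.findByKey(new Key(new Object[] { new Number(10099) }), 1);
        if (found != null && found.length > 0) {
            System.out.println("Located row: " + found[0].getAttribute("RegionId"));
        }
    }
}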


I'm using JDEV 12c, this is how Row Finder is defined (you can read more about Row Finder in my previous post - ADF BC 12c New Feature - Row Finder):


Row Finder is using View Criteria, search is done by the same key as in Find By Key method:


We load the ADF table and press the Find By Key button:


Instead of fetching all rows up to the row with key 10099 (as happens with the Get Row method), a new SQL query is executed and only one row is fetched - just as we need:


Press Find By Criteria to test second approach with Row Finder and View Criteria:


The View Criteria issues a new SQL query to search by ID and returns only one row, instead of fetching all rows:


This shows both methods are good enough and perform better than the Get Row method. You should choose one of them based on your use case requirements.

Smart Declarative Mode Support in ADF BC View Object Join

Declarative mode is a known feature of ADF BC, promoted by Steve Muench back in 2008 - Declarative Data Filtering. Declarative mode allows constructing the SQL statement dynamically at runtime, based on the displayed attributes and the ADF bindings in the page definition. This is specifically useful for systems created with ADF where tables are generic and contain a long list of attributes. Instead of loading all attributes from the DB, it makes perfect sense to load only the required ones. There is one more cool feature of declarative mode - it knows how to control the SQL join at the View Object level. Meaning - if attributes coming from a joined EO are not rendered, ADF BC updates the SQL statement and removes that join. Runtime control of joins is really important, as it may give real additional performance to the system.

Here you can see what I'm talking about - the Jobs EO is joined into the Employees VO to provide additional attributes:


The Employees VO is configured with Declarative mode - the SQL statement is hidden, as it will be constructed dynamically at runtime:


Here is an important hint, as you might be confused why Declarative mode doesn't work: you must go and set Selected in Query = false for each attribute in the VO. It seems to me like a JDEV bug - this should happen automatically. In this example, Selected in Query = false is set for the Email attribute and all the others:


There are two attributes added from joined Jobs EO - JobTitle and JobId. These attributes are set with Selected in Query = false as well:


There are two ADF task flows created for test purposes. Both are set with Isolated mode, in order to maintain separate instances of the Employees VO. I will be using different sets of attributes in these two task flows, to demonstrate that Declarative mode really works:


The first ADF task flow brings a table with 6 attributes. Pay attention - there are no JOBS related attributes coming from the join:


Make sure the ADF Logger is set to FINEST level for the oracle.jbo package; this allows seeing the log output from ADF BC and spotting the executed SQL statement:


The table from above generates a SQL statement for 6 attributes, as expected. There is one more great thing - the SQL join with the JOBS table is skipped, as there are no attributes displayed from that join:


Go to the second ADF task flow - here we are displaying only 3 attributes. One of these attributes - JobTitle - comes from the SQL join:


As expected, the SQL join is constructed in this case and 3 attributes are included in the SQL query. This results in less traffic between the DB and the server, compared to the default case where we would fetch all attribute values:


Download sample application - DeclarativeModeApp.zip.

ADF Controller Save Points - Save For Later Implementation with PS_TXN

This blog post is based on the ADF Developer Guide section - 24.7 Using Save Points in Task Flows. I will be describing the out of the box 'save for later' functionality provided by ADF BC (Model) and ADF Task Flows (Controller). This is especially useful for complex forms, where users would prefer to save their work in progress and come back to it later to finish the remaining details. We can split a transaction and allow completing it later, even during the next login. Such functionality can be achieved with the ADF out of the box feature - Save Points (see the link to the documentation provided above). In the background, Save Point functionality is based on the passivation/activation process handled in the PS_TXN table. A save point creates labels and associates these labels with temporary data stored in PS_TXN.

The sample application - ADFTaskFlowSavePointApp.zip - is enabled with Save Points. To enable Save Point support, it is enough to provide a Data Source for the Save Points:


The ADF application should create the required table for Save Points automatically, same as the PS_TXN table. The table for Save Points is ORADFCSAVPT. This table keeps information about the current Save Points for every user:


The ADF Task Flow where we enable Save Points is the same as any other ADF Task Flow. There is only one extra activity created - Save Point Restore. This activity takes a Save Point ID and restores it:


Save Point Restore activity takes Save Point ID to restore through property:


Looking from UI perspective, Save Point management can be implemented with options to create Save Point, display a list of previously created Save Points and restore one of the selected Save Points:


The Create Save Point method uses the Save Point Manager API to create a new Save Point:


The same Save Point Manager is used to remove a previously created Save Point. In order to restore a Save Point, we need to provide the Save Point ID, which is retrieved from the list:
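Roughly, the Save Point Manager is obtained from the ADF Controller context; a sketch of the create and remove calls (method names as I recall them from the oracle.adf.controller save point API - treat the exact signatures as an assumption to verify):

import oracle.adf.controller.ControllerContext;
import oracle.adf.controller.savepoint.SavePointManager;

public class SavePointBean {

    // Creates a save point for the current task flow state and returns its id.
    public String createSavePoint() {
        SavePointManager manager = ControllerContext.getInstance().getSavePointManager();
        return manager.createSavePoint();
    }

    // Removes a previously created save point by its id.
    public void removeSavePoint(String savePointId) {
        SavePointManager manager = ControllerContext.getInstance().getSavePointManager();
        manager.removeSavePoint(savePointId);
    }
}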


List with Save Points is populated through Save Point Manager API and is exposed through Data Control:


We can test it now, sample application is enabled with ADF Security and we can login with redsam user:


Navigate to the next row and change Hire Date value - create new Save Point by providing its name and pressing Create Save Point button - newly created Save Point will appear in the list:


Login with another user - scott:


The Save Point API is aware of the ADF Security context and it knows there are no Save Points created yet for the scott user:


Let's create one after changing Salary value:


And one more after changing the Hire Date value - now both Salary and Hire Date are included in the Save Point:


We can see in the DB table that there are three Save Points created - one for the redsam user and two for scott. The user names are taken from ADF Security by the Save Point API automatically:


Here is the proof for my earlier statement - Save Points work based on PS_TXN passivation/activation. When a Save Point is created, passivation happens. Later, during Save Point restore, it activates data based on the Save Point label from the PS_TXN table:


You can get similar log displayed, by setting FINEST ADF log level for oracle.jbo package:


Let's login again with redsam user and try to restore previously created Save Point:


After login, data from DB is displayed as it should be:


Choose to restore previously created Save Point - Pending hire date change. This will reset current row and apply edited data for Hire Date:


User can continue with data change and transaction.

Login with the second user - scott:


Data is loaded from DB, there are two Save Points available in the list:


Select first one - to reset current row and bring edited Salary value:


Select second one - both Salary and Hire Date will be reset to edited values:


I see ADF Controller Save Points as very useful in combination with ADF BC for complex form implementations, where we need to provide an option to split or delay a transaction. This is possible because of ADF BC - another plus for ADF BC in ADF.