Somebody got to my blog searching for how to find an external file. I thought about why they might be looking for that. Since a missing file raises an error stack, I concluded that they want to verify that the file exists before querying the external table. That way, they avoid the annoying KUP-04040 error stack, as shown:
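For context, a query against an external table whose file is missing raises a stack like this one (the table and file names here are only illustrative):

SQL> SELECT * FROM characters;
SELECT * FROM characters
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file character.csv in DOWNLOAD not found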
There are a number of ways to solve this type of problem, but they all employ Java to read the external file system. Rather than write a custom Java module to look for a specific file, I'll reuse the ListVirtualDirectory library from the Reading an external directory from SQL or PL/SQL blog page. I'm also using the GET_DIRECTORY_PATH function from this blog page.
These are the steps for this small use case. You can find broader information on the related blog pages for some elements of these steps.
1. The first thing to do is create a physical directory on the server, then a virtual directory in the database that points to the physical directory. This command creates a virtual DOWNLOAD directory for uploads when run as the SYSTEM user:

SQL> CREATE DIRECTORY download AS 'C:\Data\Files\Upload';
Using DOWNLOAD for the virtual directory and Upload for the physical directory should help avoid confusion between virtual and physical directories. After you create a directory, you need to grant privileges to the schema that will read from or write to the virtual directory (the Oracle account also needs file-system-level privileges on the physical directory). You grant read privileges like this:

SQL> GRANT READ ON DIRECTORY download TO plsql;
2. After creating the directory and granting permissions, you create an external table by using the following type of construction statement:
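For example, a minimal external table over the character.csv file in the DOWNLOAD virtual directory might look like this (the column list is illustrative, not prescribed here):

CREATE TABLE characters
( character_id    NUMBER
, character_name  VARCHAR2(30) )
ORGANIZATION EXTERNAL
( TYPE oracle_loader
  DEFAULT DIRECTORY download
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL )
  LOCATION ('character.csv'))
REJECT LIMIT UNLIMITED;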
The metadata or catalog information is stored in two views, USER_EXTERNAL_TABLES and USER_EXTERNAL_LOCATIONS (they naturally have ALL_ and DBA_ views too). The virtual directory is stored in the USER_EXTERNAL_TABLES view, while the physical file name is stored in the USER_EXTERNAL_LOCATIONS view.
Note: At this point, you should be able to query the contents of the external table. Queries return an error when no file exists or when there aren't adequate permissions to read the file. Permissions have two levels. One level is the file system, which requires that the Oracle account can read or write files in the physical directory. The other level is the database, which requires that the SYSTEM user grant read or write permission on the virtual directory.
3. This step requires you to follow the instructions from the Read an external directory from SQL or PL/SQL blog page. You should use the first (simplest) Java program and change the physical directory to your target directory when you call the DBMS_JAVA.GRANT_PERMISSION procedure. The test script in that blog post should return your character.csv file as a file in the directory.
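Connected as a privileged user, the grant would look something like this; the PLSQL grantee and the physical path are carried over from the earlier steps:

BEGIN
  DBMS_JAVA.GRANT_PERMISSION('PLSQL'
  ,'SYS:java.io.FilePermission'
  ,'C:\Data\Files\Upload'
  ,'read');
END;
/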
4. The next step requires opening up access to the physical paths that support virtual directories. These paths are part of the catalog and aren't available until you make some changes in the catalog. The substeps that walk you through the changes are:
4(a). Connect as the SYS user and grant the SELECT privilege on the catalog view to the SYSTEM user, which owns the function created in the next substep:

SQL> GRANT SELECT ON dba_directories TO system;
4(b). Connect as the SYSTEM user and create the GET_DIRECTORY_PATH function:

CREATE OR REPLACE FUNCTION get_directory_path
( virtual_directory IN VARCHAR2 )
RETURN VARCHAR2 IS
  -- Define return variable.
  directory_path VARCHAR2(256) := '';
  -- Define dynamic cursor.
  CURSOR get_directory (virtual_directory VARCHAR2) IS
    SELECT directory_path
    FROM   sys.dba_directories
    WHERE  directory_name = virtual_directory;
  -- Define a local exception for name violation.
  directory_name EXCEPTION;
  PRAGMA EXCEPTION_INIT(directory_name,-20001);
BEGIN
  OPEN get_directory (virtual_directory);
  FETCH get_directory INTO directory_path;
  CLOSE get_directory;
  -- Return the physical directory path.
  RETURN directory_path;
EXCEPTION
  WHEN directory_name THEN
    RAISE_APPLICATION_ERROR(-20001,'Invalid virtual directory name.');
END get_directory_path;
/
4(c). You now grant the EXECUTE privilege on the get_directory_path function to the PLSQL user:

SQL> GRANT EXECUTE ON get_directory_path TO plsql;

4(d). You then connect back as the PLSQL user:

SQL> CONNECT plsql/plsql
5. The next step involves some tricky SQL that lets you pass column values into a function called within a TABLE function. The SQL statement returns a list of all external tables and their file names after verifying that the physical files exist. It's illustrated in the next annotated screen shot:
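The query itself looks something like this sketch; it assumes the collection function published by the ListVirtualDirectory library is named list_files (a hypothetical name) and returns a collection of file names for a physical directory path:

SELECT xt.table_name
,      xt.file_name
FROM  (SELECT uxt.table_name
       ,      uxt.default_directory_name AS directory_name
       ,      uxl.location AS file_name
       FROM   user_external_tables uxt
       JOIN   user_external_locations uxl
       ON     uxt.table_name = uxl.table_name) xt
CROSS JOIN TABLE(list_files(get_directory_path(xt.directory_name))) list
WHERE  list.column_value = xt.file_name;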
The first trick in this query is the CROSS JOIN between the inline view and the TABLE function call. You must list the inline view that provides the column value passed as a parameter to the GET_DIRECTORY_PATH function first. If you list it second, the query fails, because only when the inline view appears first are its column values in scope for the TABLE function. The TABLE function converts a SQL scalar collection into an aggregate table of one column, which you can use inside SQL statements. The return column name for any call to the TABLE function is COLUMN_VALUE. The second trick is that we create an alias for COLUMN_VALUE because of the restrictions that govern its use. The last (third) trick isn't really a trick; it's a join between the inline view aliased as XT and the aliased result of the TABLE function.
6. Lastly, you need to write a function that makes the query parameter driven. The following depicts a function that returns a BOOLEAN value of true when the file is present and false when it isn't. This lets you check whether the external file is present before querying the external table.
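A sketch of such a function, reusing the step 5 query with the same hypothetical list_files collection function:

CREATE OR REPLACE FUNCTION external_file_found
( table_in IN VARCHAR2 )
RETURN BOOLEAN IS
  -- Control variable, false until a row proves the file exists.
  lv_found BOOLEAN := FALSE;
  -- The step 5 query, filtered by the table name parameter.
  CURSOR check_file (cv_table VARCHAR2) IS
    SELECT xt.file_name
    FROM  (SELECT uxt.table_name
           ,      uxt.default_directory_name AS directory_name
           ,      uxl.location AS file_name
           FROM   user_external_tables uxt
           JOIN   user_external_locations uxl
           ON     uxt.table_name = uxl.table_name) xt
    CROSS JOIN TABLE(list_files(get_directory_path(xt.directory_name))) list
    WHERE  list.column_value = xt.file_name
    AND    xt.table_name = UPPER(cv_table);
BEGIN
  -- Any fetched row means the physical file exists.
  FOR i IN check_file(table_in) LOOP
    lv_found := TRUE;
  END LOOP;
  RETURN lv_found;
END external_file_found;
/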
This solution demonstrates how you can avoid the annoying KUP-04040 error stack in code that queries external tables. I hope it's useful. If you find any problems, please post a comment to let me know.