From cbf755bd65317f6a0f84fa13a5404eb02d0c9a6d Mon Sep 17 00:00:00 2001
From: Remington Steed
Date: Fri, 6 Jun 2014 11:59:58 -0400
Subject: [PATCH] LP#968514: Revive bib/auth importing docs from 2.1, add intro

This commit revives the following sections from the 2.1 version of the
docs and updates them for 2.6:

- marc_export: Exporting Bibliographic Records into MARC files
  - includes new options for 2.6
- Importing Authority Records from Command Line
- Importing Authority Records from the Staff Client

The first two are included in a new section called "Support Scripts"
within the "Developer Resources" section, with a new intro and a summary
of other commonly used scripts provided with Evergreen. The third section
is moved inside the Cataloging chapter "Batch Importing MARC Records",
with slight modifications to the intro of that chapter.

Signed-off-by: Remington Steed
---
 docs/cataloging/batch_importing_MARC.txt |  46 +++++-
 docs/development/support_scripts.txt     | 173 +++++++++++++++++++++++
 docs/root.txt                            |   2 +
 3 files changed, 218 insertions(+), 3 deletions(-)
 create mode 100644 docs/development/support_scripts.txt

diff --git a/docs/cataloging/batch_importing_MARC.txt b/docs/cataloging/batch_importing_MARC.txt
index 0b63150d1c..4cfa37ee81 100644
--- a/docs/cataloging/batch_importing_MARC.txt
+++ b/docs/cataloging/batch_importing_MARC.txt
@@ -2,13 +2,13 @@ Batch Importing MARC Records
 ----------------------------
 [[batchimport]]
 The cataloging module includes an enhanced MARC Batch Import interface for
-loading MARC records. This interface allows you to specify match points
+loading MARC records. In general, it can handle batches of up to 5,000
+records without a problem. This interface allows you to specify match points
 between incoming and existing records, to specify MARC fields that should be
 overlaid or preserved, and to only overlay records if the incoming record is
 of higher quality than the existing record. Records are added to a queue where
 you can apply filters that enable you to generate any errors that may have
-occurred during import. You can print your queue, email your queue, or export
-your queue as a CSV file.
+occurred during import. You can print, email, or export your queue as a CSV file.
 
 Permissions
 ~~~~~~~~~~~
@@ -309,3 +309,43 @@ The following *Library Settings* can be configured to apply these default values
 
 * *Vandelay: Default Circulation Modifier* —Default circulation modifier value for imported items
 
+Importing Authority Records from the Staff Client
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+For an alternate method of importing authority records, read
+<<_importing_authority_records_from_command_line,Importing Authority
+Records from Command Line>>.
+
+To import a set of MARC authority records from the _MARC Batch
+Import/Export_ interface:
+
+. From the Evergreen staff client, select *Cataloging -> MARC Batch
+Import/Export*. The _Evergreen MARC File Upload_ screen opens, with
+_Import Records_ as the highlighted tab.
+
+. From the *Record Type* drop-down menu, select *Authority Records*.
+
+. Enter a name for the queue (batch import job) in the *Create a New
+Upload Queue* field.
+
+. Select the *Import Non-Matching Records* checkbox.
+
+. Click the *Browse* button to select the file of MARC authorities to import.
+
+. Click the *Upload* button to begin importing the records.
++
+The screen displays "Uploading... Processing..." to show that the records
+are being transferred to the server, then displays a progress bar to show
+the actual import progress. Once the progress bar appears, you can safely
+disconnect your staff client. Very large batches of records might time out
+at this stage.
+
+. Once the import is finished, the staff client displays the results of
+the import process. You can manually display the import progress by
+selecting the _Inspect Queue_ tab of the _MARC Batch Import/Export_
+interface and selecting the queue name. By default, the staff client does
+not display records that were imported successfully; it only shows records
+that conflicted with existing entries in the database. The screen shows
+the overall status of the import process in the top right-hand corner,
+with the Total and Imported number of records for the queue.
+
diff --git a/docs/development/support_scripts.txt b/docs/development/support_scripts.txt
new file mode 100644
index 0000000000..229d7d2e0b
--- /dev/null
+++ b/docs/development/support_scripts.txt
@@ -0,0 +1,173 @@
+== Support Scripts
+
+Various scripts are included with Evergreen in the `/openils/bin/` directory
+(and in the source code in `Open-ILS/src/support-scripts` and
+`Open-ILS/src/extras`). Some of them are used during the installation
+process, such as `eg_db_config`, while others are usually run as cron jobs
+for routine maintenance, such as `fine_generator.pl` and `hold_targeter.pl`.
+Others are useful for less frequent needs, such as the scripts for
+importing/exporting MARC records. You may explore these scripts and adapt
+them for your local needs. You are also welcome to share your improvements
+or ask any questions on the
+http://evergreen-ils.org/communicate/[Evergreen IRC channel or email lists].
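+
+For example, the fine generator and hold targeter are typically scheduled
+as cron jobs for the `opensrf` user. The following crontab sketch assumes a
+default installation layout and passes each script the standard OpenSRF
+configuration file; the frequencies shown are only illustrative (see the
+`crontab.example` file in `Open-ILS/examples/` in the source for Evergreen's
+distributed example):
+
+----
+# generate overdue fines every 30 minutes
+*/30 * * * *  /openils/bin/fine_generator.pl /openils/conf/opensrf_core.xml
+# target copies to fill holds every 15 minutes
+*/15 * * * *  /openils/bin/hold_targeter.pl /openils/conf/opensrf_core.xml
+----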
+
+Here is a summary of the most commonly used scripts. The script name links
+to more thorough documentation, if available.
+
+ * <<_processing_action_triggers,action_trigger_runner.pl>>
+   -- Useful for creating events for specified hooks and running pending events
+ * authority_authority_linker.pl
+   -- Links reference headings in authority records to main entry headings
+      in other authority records. Should be run at least once a day (only for
+      changed records).
+ * authority_control_fields.pl
+   -- Links bibliographic records to the best matching authority record.
+      Should be run at least once a day (only for changed records).
+ * autogen.sh
+   -- Generates web files used by the OPAC, especially files related to
+      organization unit hierarchy, fieldmapper IDL, locales selection,
+      facet definitions, compressed JS files and related cache key
+ * clark-kent.pl
+   -- Used to start and stop the reporter (which runs scheduled reports)
+ * <<_creating_the_evergreen_database,eg_db_config>>
+   -- Creates the database and schema, updates configuration files, and sets
+      the Evergreen administrator username and password
+ * fine_generator.pl
+   -- Generates fines for overdue circulations; typically run as a cron job
+ * hold_targeter.pl
+   -- Targets copies to fill outstanding holds; typically run as a cron job
+ * <<_importing_authority_records_from_command_line,marc2are.pl>>
+   -- Converts authority records from MARC format to Evergreen objects
+      suitable for importing via pg_loader.pl (or parallel_pg_loader.pl)
+ * marc2bre.pl
+   -- Converts bibliographic records from MARC format to Evergreen objects
+      suitable for importing via pg_loader.pl (or parallel_pg_loader.pl)
+ * marc2sre.pl
+   -- Converts serial records from MARC format to Evergreen objects
+      suitable for importing via pg_loader.pl (or parallel_pg_loader.pl)
+ * <<_marc_export,marc_export>>
+   -- Exports authority, bibliographic, and serial holdings records into
+      any of these formats: USMARC, UNIMARC, XML, BRE, ARE
+ * osrf_control
+   -- Used to start, stop, and send signals to OpenSRF services
+ * parallel_pg_loader.pl
+   -- Uses the output of marc2bre.pl (or similar tools) to generate the SQL
+      for importing records into Evergreen in a parallel fashion
+
+
+anchor:_marc_export[]
+
+=== marc_export: Exporting Bibliographic Records into MARC files
+
+indexterm:[marc_export]
+
+The following procedure explains how to export Evergreen bibliographic
+records into MARC files using the *marc_export* support script. All steps
+should be performed by the `opensrf` user from your Evergreen server.
+
+[NOTE]
+Processing time for exporting records depends on several factors, such as
+the number of records you are exporting. If you are exporting a large
+number of records, it is recommended that you split the export ID file
+(records.txt) into smaller batches.
+
+. Create a text file list of the bibliographic record IDs you would like
+to export from Evergreen. One way to do this is using SQL:
++
+[source,sql]
+----
+SELECT DISTINCT bre.id FROM biblio.record_entry AS bre
+    JOIN asset.call_number AS acn ON acn.record = bre.id
+    WHERE bre.deleted='false' AND owning_lib=101 \g /home/opensrf/records.txt
+----
++
+This query creates a file called `records.txt` containing a column of
+distinct IDs of items owned by the organizational unit with the id 101.
+(`\g filename` is a `psql` meta-command that writes the query output to the
+named file, so run this query from the `psql` console.)
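++
+If you would rather generate the file non-interactively, the same query can
+be run with `psql` from the shell. This is a sketch assuming the default
+Evergreen database credentials; the `-A` (unaligned) and `-t` (tuples only)
+flags keep the column header, padding, and row-count footer out of the
+output so the file contains only IDs:
++
+----
+psql -U evergreen -h localhost -d evergreen -A -t -o /home/opensrf/records.txt \
+    -c "SELECT DISTINCT bre.id FROM biblio.record_entry AS bre
+        JOIN asset.call_number AS acn ON acn.record = bre.id
+        WHERE bre.deleted='false' AND owning_lib=101"
+----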
+
+. Navigate to the support-scripts folder:
++
+----
+cd /home/opensrf/Evergreen-ILS*/Open-ILS/src/support-scripts/
+----
+
+. Run *marc_export*, using the ID file you created in step 1 to define which
+records to export. The following example exports the records into MARCXML
+format.
++
+----
+cat /home/opensrf/records.txt | ./marc_export --store -i -c /openils/conf/opensrf_core.xml \
+    -x /openils/conf/fm_IDL.xml -f XML --timeout 5 > exported_files.xml
+----
+
+[NOTE]
+====================
+`marc_export` was updated in Evergreen 2.6 and no longer outputs progress
+as it executes.
+====================
+
+[NOTE]
+====================
+You can use the `--since` option to export records modified after a
+certain date and time.
+====================
+
+[NOTE]
+====================
+By default, marc_export will use the reporter storage service, which should
+work in most cases. But if you have a separate reporter database and you
+want to talk directly to your main production database instead, you can set
+the `--store` option to `cstore` or `storage`.
+====================
+
+[NOTE]
+====================
+For more information, run marc_export with the -h option:
+
+  ./marc_export -h
+====================
+
+
+=== Importing Authority Records from Command Line
+
+indexterm:[marc2are.pl]
+indexterm:[pg_loader.pl]
+
+The major advantages of the command line approach are its speed and its
+convenience for system administrators who can perform bulk loads of
+authority records in a controlled environment. For alternate instructions,
+see <<_importing_authority_records_from_the_staff_client,Importing
+Authority Records from the Staff Client>>.
+
+. Run *marc2are.pl* against the authority records, specifying the user
+name, password, and MARC type (USMARC or XML). Use `STDOUT` redirection to
+either pipe the output directly into the next command or send it to an
+output file for inspection. For example, to process a file of authority
+records in MARCXML format named `auth_small.xml` using the default user
+name and password, directing the output into a file named `auth.are`:
++
+----
+cd Open-ILS/src/extras/import/
+perl marc2are.pl --user admin --pass open-ils --marctype XML auth_small.xml > auth.are
+----
++
+[NOTE]
+The MARC type will default to USMARC if the `--marctype` option is not specified.
+
+. Run *pg_loader.pl* to generate the SQL necessary for importing the
+authority records into your system. To save time with very large batches
+of records, you can pipe the output of *marc2are.pl* directly into
+*pg_loader.pl*, as in the pipeline sketch below.
++
+----
+cd Open-ILS/src/extras/import/
+perl pg_loader.pl --auto are --order are auth.are > auth_load.sql
+----
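++
+For example, the two steps can be combined into a single pipeline, avoiding
+the intermediate `auth.are` file. This sketch reuses the sample file and
+default credentials from step 1, and assumes *pg_loader.pl* reads from
+standard input when no input file is named:
++
+----
+cd Open-ILS/src/extras/import/
+perl marc2are.pl --user admin --pass open-ils --marctype XML auth_small.xml | \
+    perl pg_loader.pl --auto are --order are > auth_load.sql
+----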
+
+. Load the authority records from the SQL file that you generated in the
+last step into your Evergreen database using the psql tool. Assuming the
+default user name, host name, and database name for an Evergreen instance,
+that command looks like:
++
+----
+psql -U evergreen -h localhost -d evergreen -f auth_load.sql
+----
+
diff --git a/docs/root.txt b/docs/root.txt
index b8c3d29ff3..27415170b1 100644
--- a/docs/root.txt
+++ b/docs/root.txt
@@ -410,6 +410,8 @@ Introduction
 Developers can use this part to learn more about the programming languages,
 communication protocols and standards used in Evergreen.
 
+include::development/support_scripts.txt[]
+
 // Push titles down one level.
 :leveloffset: 1
 
-- 
2.43.2