<?xml version='1.0' encoding='UTF-8'?>
<chapter xmlns="http://docbook.org/ns/docbook" xmlns:xi="http://www.w3.org/2001/XInclude"
	xmlns:xl="http://www.w3.org/1999/xlink" version="5.0" xml:id="migratingdata">
	<info>
		<title>Migrating Data</title>
		<abstract>
			<para>Migrating data into Evergreen can be one of the most daunting tasks for an administrator. This chapter explains procedures for migrating
			bibliographic records, copies, and patrons into the Evergreen system. It requires advanced ILS administration experience, knowledge of Evergreen data structures,
			and knowledge of how to export data from your current system or access to data export files from your current system.</para>
		</abstract>
	</info>
	<section xml:id="migratingbibrecords">
		<title>Migrating Bibliographic Records</title>
		<indexterm><primary>migrating</primary><secondary>importing bibliographic records</secondary></indexterm>
		<para>
		One of the most important and challenging tasks is migrating your bibliographic records to a new system. The procedure may differ depending on the system from which you
		are migrating and the content of the MARC records exported from the existing system. The procedures in this section deal with the process once the data from the existing system
		has been exported into MARC records. This section does not cover exporting data from your existing non-Evergreen system.</para>
		<para>Several tools for importing bibliographic records into Evergreen can be found in the Evergreen installation folder
		(<filename class="directory">/home/opensrf/Evergreen-ILS-1.6.1.2/Open-ILS/src/extras/import/</filename>) and are also available from the Evergreen repository
		(<link xl:href="http://svn.open-ils.org/trac/ILS/browser/branches/rel_1_6_1/Open-ILS/src/extras/import" xl:title="import scripts - Evergreen repository">
		http://svn.open-ils.org/trac/ILS/browser/branches/rel_1_6_1/Open-ILS/src/extras/import</link>).</para>
		<simplesect>
			<title>Converting MARC records to Evergreen BRE JSON format</title>
			<indexterm><primary>BRE JSON</primary></indexterm>
			<para>If you are starting with MARC records from your existing system or another source, use the <filename>marc2bre.pl</filename> script to create the JSON representation of a bibliographic
			record entry (hence <quote>bre</quote>) in Evergreen. <filename>marc2bre.pl</filename> can perform the following functions:</para>
			<itemizedlist>
				<listitem><para>Converts <systemitem>MARC-8</systemitem> encoded records to <systemitem>UTF-8</systemitem> encoding</para></listitem>
				<listitem><para>Converts <systemitem>MARC21</systemitem> to <systemitem>MARCXML21</systemitem></para></listitem>
				<listitem><para>Selects the unique record number field (common choices are '035' or '001'; check your records, as you might be surprised how a supposedly unique field
				actually has duplicates, though <filename>marc2bre.pl</filename> will select a unique identifier for subsequent duplicates)</para></listitem>
				<listitem><para>Extracts certain pertinent fields for indexing and display purposes (along with the complete MARCXML21 record)</para></listitem>
				<listitem><para>Sets the ID number of the first record from this batch to be imported into the biblio.record_entry table (hint: run the following
				<systemitem>SQL</systemitem> to determine what this number should be to avoid conflicts):</para>
<screen>
<userinput>psql -U postgres evergreen</userinput>
<userinput># SELECT MAX(id)+1 FROM biblio.record_entry;</userinput>
</screen>
			</listitem>
			<listitem>
				<para>If you are processing multiple sets of MARC records with <command>marc2bre.pl</command> before loading the records into the database, you will need to keep track
				of the starting ID number for each subsequent batch of records that you are importing. For example, if you are processing three files of MARC records with 10000
				records each into a clean database, you would use <option>--startid 1</option>, <option>--startid 10001</option>, and <option>--startid 20001</option>
				parameters for each respective file (see the example invocation after this list).</para>
			</listitem>
			<listitem>
				<para>Ignores <quote>trash</quote> fields that you do not want to retain in Evergreen</para>
			</listitem>
			<listitem>
				<para>If you use <filename>marc2bre.pl</filename> to convert your MARC records from the <systemitem>MARC-8</systemitem> encoding to the UTF-8 encoding, it relies
				on the <systemitem>MARC::Charset</systemitem> Perl module to complete the conversion. When importing a large set of items, you can speed up the process by using a
				utility like <systemitem>marc4j</systemitem> or <systemitem>marcdumper</systemitem> to convert the records
				to <systemitem>MARC21XML</systemitem> and <systemitem>UTF-8</systemitem> before running them through <command>marc2bre.pl</command> with the
				<option>--marctype=XML</option> flag to tell <command>marc2bre.pl</command> that the records are already in <systemitem>MARC21XML</systemitem> format with
				the <systemitem>UTF-8</systemitem> encoding. If you take this approach, due to a current limitation of <systemitem>MARC::File::XML</systemitem> you have to do a
				horrible thing and ensure that there are no namespace prefixes in front of the element names. <filename>marc2bre.pl</filename> cannot parse the following
				example:</para>
<programlisting language="xml">
<![CDATA[
<?xml version="1.0" encoding="UTF-8" ?>
<marc:collection xmlns:marc="http://www.loc.gov/MARC21/slim" 
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
  xsi:schemaLocation="http://www.loc.gov/MARC/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd">
  <marc:record>
    <marc:leader>00677nam a2200193 a 4500</marc:leader>
    <marc:controlfield tag="001">H01-0000844</marc:controlfield>
    <marc:controlfield tag="007">t </marc:controlfield>
    <marc:controlfield tag="008">060420s1950    xx            000 u fre d</marc:controlfield>
    <marc:datafield tag="040" ind1=" " ind2=" ">
      <marc:subfield code="a">CaOHCU</marc:subfield>
      <marc:subfield code="b">fre</marc:subfield>
    </marc:datafield>
...
]]>
</programlisting>
			<para>But <filename>marc2bre.pl</filename> can parse the same example with the namespace prefixes removed:</para>
<programlisting language="xml">
<![CDATA[
<?xml version="1.0" encoding="UTF-8" ?>
<collection xmlns:marc="http://www.loc.gov/MARC21/slim" 
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
  xsi:schemaLocation="http://www.loc.gov/MARC/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd">
  <record>
    <leader>00677nam a2200193 a 4500</leader>
    <controlfield tag="001">H01-0000844</controlfield>
    <controlfield tag="007">t </controlfield>
    <controlfield tag="008">060420s1950    xx            000 u fre d</controlfield>
    <datafield tag="040" ind1=" " ind2=" ">
      <subfield code="a">CaOHCU</subfield>
      <subfield code="b">fre</subfield>
    </datafield>
...
]]>
</programlisting>
			</listitem>
		</itemizedlist>
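			<para>A hypothetical invocation for the second of three 10,000-record files, assuming the records have already been converted to <systemitem>MARC21XML</systemitem> in <systemitem>UTF-8</systemitem> and that the output is captured with shell redirection; the exact set of flags accepted by your version of the script may differ, so check its usage statement first:</para>
<screen>
<userinput>perl marc2bre.pl --startid 10001 --marctype=XML file2.xml &gt; file2.bre</userinput>
</screen>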
		</simplesect>
		<simplesect>
			<title>Converting Records for Import into PostgreSQL</title>
			<indexterm><primary>migrating</primary><secondary>import into PostgreSQL</secondary></indexterm>
			<para>Once you have your records in Evergreen's <systemitem>BRE JSON</systemitem> format, you then need to use <command>direct_ingest.pl</command> to convert the records
			into the generic ingest <systemitem>JSON</systemitem> format for Open-ILS.
			This step uses the <systemitem>open-ils.ingest</systemitem> application to extract the data that will be indexed in the database.</para>
			<para>Once you have your records in Open-ILS <systemitem>JSON</systemitem> ingest format, you then need to use <command>pg_loader.pl</command> to convert these records into a
			set of <systemitem>SQL</systemitem> statements that you can use to
			load the records into PostgreSQL. The <option>--order</option> and <option>--autoprimary</option> command line options (bre, mrd, mfr, etc.) map to class IDs defined in
			<filename>/openils/conf/fm_IDL.xml</filename>.</para>
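			<para>A minimal sketch of the whole conversion pipeline, assuming file names, the database user, and the class IDs passed to <option>--order</option> and <option>--autoprimary</option> are all examples; each script's usage statement will show the exact options your version supports:</para>
<screen>
<userinput>perl direct_ingest.pl file2.bre &gt; file2.ingest</userinput>
<userinput>perl pg_loader.pl --order bre --autoprimary mrd &lt; file2.ingest &gt; file2.sql</userinput>
<userinput>psql -U evergreen -f file2.sql evergreen</userinput>
</screen>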
		</simplesect>
		<simplesect>
			<title>Adding Metarecords to the Database</title>
			<indexterm><primary>migrating</primary><secondary>adding metarecords</secondary></indexterm>
			<para>Once you have loaded the records into PostgreSQL, you can create metarecord entries in the <link linkend="metabib.table.metarecord">metabib.metarecord</link> table by running the following <systemitem>SQL</systemitem>:</para>
<screen>
<userinput>psql evergreen</userinput>
<userinput># \i /home/opensrf/Evergreen-ILS-1.6*/Open-ILS/src/extras/import/quick_metarecord_map.sql</userinput>
</screen>
			<para>Metarecords are required to place holds on items, among other actions.</para>
		</simplesect>
	</section>
	<section xml:id="migratingbibrecordcopies">
		<title>Adding Copies to Bibliographic Records</title>
		<indexterm><primary>migrating</primary><secondary>adding copies</secondary></indexterm>
		<para>Once you've loaded the bibliographic records in Evergreen, you can search and view the records in the staff client, but they will not be visible in the catalogue. By
		default, bibliographic records will not be visible in the catalogue until you add a copy representing a physical manifestation of that resource. You can add a copy manually through
		the staff client via the Holdings maintenance screen, but if you're bulk-importing MARC records you probably want to bulk load the associated copies, call numbers, and barcodes as
		well.</para>
		<simplesect>
			<title>Importing volumes and copies from <systemitem>MARC21XML</systemitem> holdings</title>
			<indexterm><primary>migrating</primary><secondary>importing volumes</secondary></indexterm>
			<para>There is currently no simple method for importing holdings based on the contents of the MARC holdings field (852, as specified by
			<link xl:href="http://www.loc.gov/marc/holdings/">http://www.loc.gov/marc/holdings/</link>).
			However, a more or less automated method could be built that performs the following steps:</para>
			<procedure>
				<step><para>Create a tab-delimited file that contains your holdings information.</para>
				<itemizedlist>
					<listitem>Required fields: bibliographic ID, barcode, and call number</listitem>
					<listitem>Optional fields: shelving location (text); see the <link linkend="asset.table.copy">asset.copy table</link> for
					possible fields to include</listitem>
				</itemizedlist>
				</step>
				<step><para>Create a staging table that matches the contents of your tab-delimited file.</para>
				</step>
				<step><para>Insert the contents of your tab-delimited file into the staging table.</para>
				</step>
				<step><para>Generate <systemitem>SQL</systemitem> scripts for item import to match the staging table that you created.</para>
				</step>
				<step><para>Run the <systemitem>SQL</systemitem> scripts to create the holdings in Evergreen (a sketch of these steps follows this procedure).</para>
				</step>
			</procedure>
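			<para>A minimal sketch of those steps in <systemitem>SQL</systemitem>, assuming a three-column tab-delimited file, an owning and circulating library with ID 4, and user ID 1 as creator and editor; the columns required by <link linkend="asset.table.copy">asset.copy</link> and asset.call_number should be verified against your Evergreen version:</para>
<programlisting language="sql">
-- Staging table matching the hypothetical tab-delimited file.
CREATE TABLE staging_items (bib_id bigint, barcode text, call_number text);

-- Load the file (server-side path; from psql, \copy reads a client-side file).
COPY staging_items FROM '/tmp/holdings.tsv';

-- Create one call number per staged row...
INSERT INTO asset.call_number (creator, editor, record, owning_lib, label)
	SELECT 1, 1, bib_id, 4, call_number FROM staging_items;

-- ...then attach a copy with its barcode to each call number.
INSERT INTO asset.copy (circ_lib, creator, editor, call_number, barcode, loan_duration, fine_level)
	SELECT 4, 1, 1, acn.id, si.barcode, 2, 2
	FROM staging_items si
		INNER JOIN asset.call_number acn
			ON acn.record = si.bib_id AND acn.label = si.call_number;
</programlisting>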
			<para>If an ILS has the concept of <quote>item categories</quote>, these may be mapped to Evergreen via statistical categories in the
			<link linkend="asset.table.stat_cat">asset.stat_cat table</link>. Note that statistical categories cannot be used as search filters, and that individual branches define
			their own statistical categories and their own statistical category entries for individual items; the best use case for statistical categories is probably gifts.</para>
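			<para>A hypothetical mapping of one such item category, assuming an owning org unit with ID 4; verify the column names against <link linkend="asset.table.stat_cat">asset.stat_cat</link> and asset.stat_cat_entry in your version:</para>
<programlisting language="sql">
-- Create the statistical category, then one entry under it.
INSERT INTO asset.stat_cat (owner, name) VALUES (4, 'Gift');
INSERT INTO asset.stat_cat_entry (stat_cat, owner, value)
	SELECT id, 4, 'Donated by Friends of the Library'
	FROM asset.stat_cat WHERE owner = 4 AND name = 'Gift';
</programlisting>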
			<para>In 2009, Conifer placed their <link xl:href="http://svn.open-ils.org/trac/ILS-Contrib/browser/conifer/branches/rel_1_6_1/tools/migration-scripts"
			xl:title="Conifer migration tools">migration tools</link>
			in the <link xl:href="http://svn.open-ils.org/trac/ILS-Contrib/browser/conifer" xl:title="Conifer ILS-Contrib SVN repository">Conifer ILS-Contrib SVN repository</link>, which might be useful samples augmenting the
			basic staging table import approach.</para>
			<para>In 2010, Equinox contributed a set of <link xl:href="http://git.esilibrary.com/?p=migration-tools.git;a=summary" xl:title="Equinox migration utilities">migration utilities</link>.</para>
		</simplesect>
	</section>
	<section xml:id="migratingpatrons">
		<title>Migrating Patron Data</title>
		<indexterm><primary>migrating</primary><secondary>importing patrons</secondary></indexterm>
		<para>
		This section explains the task of migrating your patron data from comma delimited files<indexterm><primary>comma delimited files</primary></indexterm> into Evergreen.
		It does not deal with the process of exporting from the non-Evergreen
		system, since this process may vary depending on where you are extracting your patron records. Patron data could come from an ILS, or from a student database in the case of
		academic institutions.
		</para>
		<para>When importing records into Evergreen you will need to populate three tables in your Evergreen database:</para>
		<itemizedlist>
			<listitem><link linkend="actor.table.usr">actor.usr</link> - The main table for user data</listitem>
			<listitem><link linkend="actor.table.card">actor.card</link> - Stores the barcode for users; users can have more than one card, but only one can be active at a given time</listitem>
			<listitem><link linkend="actor.table.usr-address">actor.usr_address</link> - Used for storing address information; a user can have more than one address</listitem>
		</itemizedlist>
		<para>Before following the procedures below to import patron data into Evergreen, it is a good idea to examine the fields in these tables in order to decide on a strategy
		for the data to include
		in your import. It is important to understand the data types and constraints on each field.</para>
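		<para>You can examine each table's column types and constraints from within <command>psql</command>, for example:</para>
<screen>
<userinput>psql -U evergreen evergreen</userinput>
<userinput># \d actor.usr</userinput>
</screen>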
		<procedure>
			<step>
				<para>Export the patron data from your existing ILS or from another source into a comma delimited file. The comma delimited file used for importing
				the records should use Unicode (UTF8)<indexterm><primary>Unicode</primary></indexterm> character encoding.</para>
			</step>
			<step>
				<para>Create a staging table.<indexterm><primary>staging table</primary></indexterm> A staging table will allow you to tweak the data before importing.
				Here is an example <systemitem>SQL</systemitem> statement:</para>
				<indexterm><primary>sql</primary></indexterm>
<programlisting language="sql">
CREATE TABLE students (
	student_id int, barcode text, last_name text, first_name text, program_number text, program_name text, email text, address_type text, street1 text,
	street2 text, city text, province text, country text, postal_code text, phone text, profile int, ident_type int, home_ou int,
	claims_returned_count int DEFAULT 0, usrname text, net_access_level int DEFAULT 2, password text
);
</programlisting>
				<para>Note the <varname>DEFAULT</varname> values. These allow you to set defaults for your library, or to populate fields that are required in Evergreen
				where your data contains <systemitem>NULL</systemitem> values.</para>
			</step>
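			<step>
				<para>Load the comma delimited file into the staging table. A minimal sketch using <command>psql</command>'s client-side <command>\copy</command>, assuming a file named <filename>students.csv</filename> with a header row and columns in the same order as the staging table:</para>
<programlisting language="sql">
-- Run from within psql; \copy reads the file from the client machine.
\copy students FROM 'students.csv' WITH CSV HEADER
</programlisting>
			</step>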
			<step>
				<para>Some fields may need reformatting to fit Evergreen field formats. Here is an example of <systemitem>SQL</systemitem> to adjust phone numbers in the staging
				table to fit the Evergreen field:</para>
<programlisting language="sql">
UPDATE students SET phone = replace(replace(replace(rpad(substring(phone from 1 for 9), 10, '-') || substring(phone from 10), '(', ''), ')', ''), ' ', '-');
</programlisting>
				<para>Data <quote>massaging</quote> may be required to fit formats used in Evergreen.</para>
			</step>
			<step>
				<para>Insert records from the staging table into the <link linkend="actor.table.usr">actor.usr</link> Evergreen table:</para>
<programlisting language="sql">
INSERT INTO actor.usr (
	profile, usrname, email, passwd, ident_type, ident_value, first_given_name, family_name, day_phone, home_ou, claims_returned_count,
	net_access_level)
	SELECT profile, students.usrname, email, student_id, ident_type, student_id, first_name, last_name, phone, home_ou, claims_returned_count,
	net_access_level
	FROM students;
</programlisting>
			</step>
			<step>
				<para>Insert records into <link linkend="actor.table.card">actor.card</link> from <link linkend="actor.table.usr">actor.usr</link>:</para>
<programlisting language="sql">
INSERT INTO actor.card (usr, barcode)
	SELECT actor.usr.id, students.barcode
	FROM students
		INNER JOIN actor.usr
			ON students.usrname = actor.usr.usrname;
</programlisting>
				<para>This assumes a one-to-one card-to-patron relationship. If your patron data assigns multiple cards to one patron, more complex import scripts may be required that look for inactive or active flags.</para>
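				<para>One possible approach, assuming for illustration that your staging data carries an <literal>active</literal> flag column for each card (actor.card has an active column, but your export may not):</para>
<programlisting language="sql">
-- Insert every card, carrying through the staging active flag so that
-- only the flagged card becomes the patron's active card.
INSERT INTO actor.card (usr, barcode, active)
	SELECT actor.usr.id, students.barcode, students.active
	FROM students
		INNER JOIN actor.usr
			ON students.usrname = actor.usr.usrname;
</programlisting>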
			</step>
			<step>
				<para>Update the actor.usr.card field with actor.card.id to associate the active card with the user:</para>
<programlisting language="sql">
UPDATE actor.usr
	SET card = actor.card.id
	FROM actor.card
	WHERE actor.card.usr = actor.usr.id;
</programlisting>
			</step>
			<step>
				<para>Insert records into <link linkend="actor.table.usr-address">actor.usr_address</link> to add address information for users:</para>
<programlisting language="sql">
INSERT INTO actor.usr_address (usr, street1, street2, city, state, country, post_code)
	SELECT actor.usr.id, students.street1, students.street2, students.city, students.province, students.country, students.postal_code
	FROM students
	INNER JOIN actor.usr ON students.usrname = actor.usr.usrname;
</programlisting>
			</step>
			<step>
				<para>Update the <link linkend="actor.table.usr-address">actor.usr</link> mailing and billing addresses with the address IDs from the actor.usr_address table:</para>
<programlisting language="sql">
UPDATE actor.usr
	SET mailing_address = actor.usr_address.id, billing_address = actor.usr_address.id
	FROM actor.usr_address
	WHERE actor.usr.id = actor.usr_address.usr;
</programlisting>
				<para>This assumes one address per patron. More complex scenarios may require more sophisticated SQL.</para>
			</step>
		</procedure>
		<simplesect>
			<title>Creating an SQL Script for Importing Patrons</title>
			<para>The procedure for importing patrons can be automated with the help of an SQL script. Follow these steps to create an import script:</para>
			<procedure>
				<step>
					<para>Create a new file and name it <filename>import.sql</filename>.</para>
				</step>
				<step>
					<para>Edit the file to look similar to this:</para>
<programlisting>
BEGIN;

-- Create staging table.
CREATE TABLE students (
	student_id int, barcode text, last_name text, first_name text, program_number text, program_name text, email text, address_type text,
	street1 text, street2 text, city text, province text, country text, postal_code text, phone text, profile int, ident_type int, home_ou int,
	claims_returned_count int DEFAULT 0, usrname text, net_access_level int DEFAULT 2, password text
);


--Insert records from the staging table into the actor.usr table.
INSERT INTO actor.usr (
	profile, usrname, email, passwd, ident_type, ident_value, first_given_name, family_name, day_phone, home_ou, claims_returned_count,
	net_access_level)
	SELECT profile, students.usrname, email, student_id, ident_type, student_id, first_name, last_name, phone, home_ou, claims_returned_count,
	net_access_level FROM students;

--Insert records into the actor.card table from the staging table.
INSERT INTO actor.card (usr, barcode)
	SELECT actor.usr.id, students.barcode
	FROM students
		INNER JOIN actor.usr
			ON students.usrname = actor.usr.usrname;

--Update actor.usr.card field with actor.card.id to associate the active card with the user.
UPDATE actor.usr
	SET card = actor.card.id
	FROM actor.card
	WHERE actor.card.usr = actor.usr.id;

--Insert records into actor.usr_address from the staging table.
INSERT INTO actor.usr_address (usr, street1, street2, city, state, country, post_code)
	SELECT actor.usr.id, students.street1, students.street2, students.city, students.province, students.country, students.postal_code
	FROM students
	INNER JOIN actor.usr ON students.usrname = actor.usr.usrname;


--Update actor.usr mailing and billing addresses with the id from the actor.usr_address table.
UPDATE actor.usr
	SET mailing_address = actor.usr_address.id, billing_address = actor.usr_address.id
	FROM actor.usr_address
	WHERE actor.usr.id = actor.usr_address.usr;

COMMIT;
</programlisting>
					<para>Placing the SQL statements between <code>BEGIN;</code> and <code>COMMIT;</code> creates a transaction block, so that if any SQL statement fails, the
					entire process is cancelled and the database is rolled back to its original state. Lines beginning with <code>--</code> are comments that describe what
					each SQL statement is doing and are not processed.</para>
				</step>
			</procedure>
		</simplesect>
		<simplesect>
			<title>Batch Updating Patron Data</title>
			<indexterm><primary>migrating</primary><secondary>batch updating patrons</secondary></indexterm>
			<para>For academic libraries, doing batch updates to add new patrons to the Evergreen database is a critical task. The above procedures and
			import script can be easily adapted to create an update script for importing new patrons from external databases. If the data import file contains only new patrons, then
			the above procedures will work well to insert those patrons. However, if the data load contains all patrons, a second staging table and a procedure to remove existing
			patrons from that second staging table may be required before importing the new patrons (see the sketch below). Moreover, additional steps to update address information and perhaps delete
			inactive patrons may also be desired, depending on the requirements of the institution.</para>
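			<para>A minimal sketch of that removal step, assuming a second staging table named <literal>students_new</literal> with the same layout as <literal>students</literal> and matching patrons on username (adjust the match column to suit your data):</para>
<programlisting language="sql">
-- Drop rows for patrons who already exist in Evergreen.
DELETE FROM students_new
	WHERE usrname IN (SELECT usrname FROM actor.usr);
</programlisting>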
			<para>After the scripts to import and update patrons have been developed, another important task for library staff is to develop an import strategy and schedule
			that suits the needs of the library. This could be determined by the registration dates of your institution in the case of academic libraries. It is important to balance
			the convenience of automated patron loads and the cost of processing those loads against staff adding patrons manually.</para>
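			<para>A scheduled load can be as simple as a <systemitem>cron</systemitem> entry that runs the import script; a hypothetical example that loads patrons nightly at 02:00 (the path, user, and database name are assumptions):</para>
<programlisting>
0 2 * * * psql -U evergreen -f /home/opensrf/import.sql evergreen
</programlisting>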
		</simplesect>
	</section>
	<section xml:id="electronicresourcesvisible">
		<title>Making Electronic Resources Visible in the Catalogue</title>
		<indexterm><primary>migrating</primary><secondary>importing bibliographic records</secondary><tertiary>electronic resources</tertiary></indexterm>
		<para>For electronic resources that should be visible in the catalogue without any copies, you must set the source column value in the biblio.record_entry row for the respective
		bibliographic record to a value that matches the corresponding ID of the <link linkend="config.table.bib-source">config.bib_source</link> row where the transcendant value is
		<emphasis>TRUE</emphasis>. Here's a practical example:</para>
		<procedure>
			<step>
				<para>Connect to your Evergreen database with <command>psql</command> (substitute username / database name as required):</para>
<screen>
<userinput>psql -U postgres evergreen</userinput>
</screen>
			</step>
			<step>
				<para>Add a source for your electronic resources:</para>
<screen>
<userinput># INSERT INTO config.bib_source(quality, source, transcendant) VALUES (50, 'Institutional repository', TRUE);</userinput>
</screen>
			</step>
			<step>
				<para>Find the ID that was generated for your new source:</para>
<screen>
<userinput># SELECT id FROM config.bib_source WHERE source = 'Institutional repository';</userinput>
</screen>
			</step>
			<step>
				<para>Update the source column for your bibliographic record for the electronic resource (for the sake of the example, let's assume that the ID returned for the new
				source was 4, and that we know that the bib record with ID 75 is an electronic resource from your institutional repository):</para>
<screen>
<userinput># UPDATE biblio.record_entry SET source = 4 WHERE biblio.record_entry.id = 75;</userinput>
</screen>
			</step>
		</procedure>
	</section>
	<section xml:id="emptydatabase">
		<title>Restoring your Evergreen Database to an Empty State</title>
		<indexterm><primary>databases</primary><secondary>restoring Evergreen to an empty state</secondary></indexterm>
		<para>If you've done a test import of records and you want to quickly get Evergreen back to a pristine state, you can create a clean Evergreen database schema by performing the
		following:</para>
		<procedure>
			<step>
				<para>Change to the directory containing the database build script:</para>
<screen>
<userinput>cd ILS/Open-ILS/src/sql/Pg/</userinput>
</screen>
			</step>
			<step>
				<para>Rebuild the database schema:</para>
<screen>
<userinput>./build-db.sh [db-hostname] [db-port] [db-name] [db-user] [db-password] [db-version]</userinput>
</screen>
				<caution><para>This will remove all of your data from the database and restore the default values.</para></caution>
			</step>
		</procedure>
	</section>
</chapter>