Evergreen ILS Website

IRC log for #evergreen, 2014-02-06


All times shown according to the server's local time.

Time Nick Message
00:35 pinesol_green [evergreen|Dan Scott] Restore OpenSearch support and use TPAC search - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=a1cfeda>
00:35 pinesol_green [evergreen|Ben Shum] OpenSearch release note - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=746bd28>
01:05 pinesol_green [evergreen|Jeff Godin] Skip duplicate username check when username unchanged - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=dd576d0>
01:55 ktomita_ joined #evergreen
01:55 sseng_ joined #evergreen
03:42 fparks_ joined #evergreen
05:32 artunit_ joined #evergreen
05:55 * csharp would be pretty strongly against using the auditor tables for anything other than auditing purposes
08:22 akilsdonk joined #evergreen
08:29 finnx joined #evergreen
08:33 jl- joined #evergreen
08:40 Shae joined #evergreen
08:40 jl- we're trying to import our marc records but we get this error
08:40 jl- ERROR:  Attempted to INSERT MARCXML that is not well formed
08:40 jl- CONTEXT:  SQL statement "INSERT INTO biblio.record_entry (marc, last_xact_id) VALUES (stage.marc, 'IMPORT')"
08:43 dluch joined #evergreen
08:45 dbs csharp: fair enough; for a general purpose feature, auditor would be out. but for a local implementation, that would be the fastest way to build on what's there
08:45 dbs jl-: you're going to need to figure out which of the stage.marc records is malformed XML
08:50 dbs jl-: something like "SELECT id FROM staging_table WHERE xml_is_well_formed(marc) IS NOT TRUE;" would be a good start
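A slightly expanded sketch of the check dbs suggests above; the table name staging_records_import and the id column are assumptions borrowed from the stock migration docs, so adjust them to match the actual staging table. xml_is_well_formed() is a stock PostgreSQL function, so this can run directly in psql before attempting the INSERT into biblio.record_entry:

    -- list the offending rows with a preview of the broken MARCXML
    SELECT id, left(marc, 120) AS marc_start
      FROM staging_records_import
     WHERE xml_is_well_formed(marc) IS NOT TRUE;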
08:52 mmorgan joined #evergreen
08:56 artunit joined #evergreen
09:08 timhome joined #evergreen
09:17 csharp dbs: that sounds fine to me
09:21 ericar joined #evergreen
09:22 phasefx I still miss the "transparencies" idea
09:29 bradl man, that was a good idea... like a lot of other good ideas we had that we didn't have time to do
09:35 * tsbere wonders what the "transparencies" idea is/was
09:36 dbs tsbere: transparencies as in overlays IIRC
09:37 * dbs wonders just how badly other systems show "EXPLAIN ANALYZE SELECT (SELECT record FROM authority.simple_heading WHERE id = func.heading) FROM authority.axis_browse_center('subject', 'civilization aegean', '0', '20') AS func(heading);"
09:37 dbs we're getting some pretty horrible results on 2.4
09:38 tsbere dbs: What is "horrible" in this case?
09:39 * tsbere got "(cost=0.25..84.49 rows=10 width=8) (actual time=1312.835..1358.128 rows=20 loops=1)" on MVLC's system
09:39 dbs seems to be taking forever (minutes) to return; right now digging into the nest of functions and wondering about the performance of simple_heading_find_pivot
09:39 dbs tsbere: how many authority records do you have?
09:39 dbs we have a couple million
09:40 tsbere Looks like 578886 rows in authority.record_entry
09:40 dbs hrm
09:42 collum joined #evergreen
09:54 yboston joined #evergreen
09:55 kmlussier Does anyone want to add their names to the Doodle poll for next week's dev meeting? http://doodle.com/shbpr649h6xint7r
09:55 kmlussier Best day for the meeting is still a toss up.
10:04 remingtron joined #evergreen
10:05 dbs SELECT  * -- bib search: #CD_documentLength #CD_meanHarmonic #CD_uniqueWords  keyword:  estimation_strategy(inclusion) site(1) limit(2500) core_limit(62500)
10:05 dbs That's a bit of a performance buster.
10:07 dbs Apparently QP allows null searches, which turn into "WHERE 1 = 1 AND (TRUE)", which will seqscan your biblio.record_entry table (I assume)
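A hedged illustration of the degenerate filter dbs describes: with no real predicate, the planner has nothing to drive an index scan, so expect a Seq Scan over biblio.record_entry in the plan output.

    EXPLAIN SELECT id FROM biblio.record_entry WHERE 1 = 1 AND (TRUE);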
10:27 j_scott joined #evergreen
10:33 * dbs wonders if we're going to treat bug 1243023 and bug 1253163 as blockers
10:33 bshum dbs: How would you like to proceed with https://bugs.launchpad.net/evergreen/+bug/874296 ?
10:33 pinesol_green Launchpad bug 1243023 in Evergreen "Browse catalogue titles are doubly escaped?" (affected: 1, heat: 6) [Undecided,New] https://launchpad.net/bugs/1243023
10:33 pinesol_green Launchpad bug 1253163 in Evergreen "authority indexes can fail on Postgres 9.3.0" (affected: 2, heat: 12) [Undecided,Confirmed] https://launchpad.net/bugs/1253163
10:33 pinesol_green Launchpad bug 874296 in Evergreen "Replace ARRAY_ACCUM with ARRAY_AGG (and STRING_AGG)" (affected: 2, heat: 10) [Medium,Confirmed] - Assigned to Ben Shum (bshum)
10:33 bshum Ack, hehe
10:35 dbs bshum: I would like to make 874296 happen!
10:35 bshum Also, thanks for pointing out my wacky find and replace job.
10:35 dbs No, you found lots more good stuff!
10:41 bshum Okay, well I'll be back in about 45 minutes or so.  I'd like to try getting the upgrade SQL worked out for those changes and getting them into master today if we can.
10:41 bshum Since you worry about shuffle, I'd like to kill this old bug once and for all before we tackle more things.
10:42 * bshum wanders off for a bit
10:42 Dyrcona joined #evergreen
10:47 rjackson-isl joined #evergreen
10:47 * Dyrcona is working from home again.--Stupid car.
10:57 dbwells dbs: your authority speed problem sounds really similar to bug #1244432.  Maybe like the metabib pivot functions, authority.simple_heading_find_pivot() needs to be 'stable' as well?
10:57 pinesol_green Launchpad bug 1244432 in Evergreen "Browse search functions need to be 'stable'" (affected: 1, heat: 8) [Critical,Fix released] https://launchpad.net/bugs/1244432
10:58 dbwells (or some other functions in that chain)
10:59 dbs That seems very likely
10:59 dbs dbwells++
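A hedged way to check the volatility dbwells is talking about before deciding which functions in the chain need the bug 1244432 treatment. The catalog query is stock PostgreSQL and lists each authority.simple_heading* function with its current volatility flag ('v' = VOLATILE, 's' = STABLE, 'i' = IMMUTABLE):

    SELECT p.proname,
           pg_get_function_identity_arguments(p.oid) AS args,
           p.provolatile
      FROM pg_proc p
      JOIN pg_namespace n ON n.oid = p.pronamespace
     WHERE n.nspname = 'authority'
       AND p.proname LIKE 'simple_heading%';

Anything still reported as 'v' could then be switched with ALTER FUNCTION ... STABLE, using the argument list shown by the query.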
11:04 RoganH joined #evergreen
11:29 smyers_ joined #evergreen
11:40 mllewellyn joined #evergreen
11:41 Dyrcona Well, I actually finished something for a change!
11:41 Dyrcona working_from_home++
11:42 Dyrcona I think I'll build a new dev vm so I can start on something else with a clean slate tomorrow.
11:43 jwoodard joined #evergreen
12:23 jihpringle joined #evergreen
12:24 bshum csharp: Hmm, did the Evergreen 2011 site go away?
12:26 bshum Also, hmm
12:26 bshum The spinner is missing from catalogs that have autosuggest enabled.
12:26 bshum But it is there on ones where I have it disabled.
12:26 bshum "the spinner" being that little animation that takes away the search button whenever you initiate a search.
12:27 * bshum tries to look at that now.
12:41 ericar joined #evergreen
12:48 eeevil bshum: does the spinner show up if you actually click the Go! button instead of selecting an autosuggestion?
12:48 csharp bshum: I'll look into it - we updated the PINES site and that must've been attached to it :-/
12:49 bshum eeevil: Oh interesting.  If you click on the button, then yes, it does change to the spinner.  If you click on a suggestion or hit enter, it doesn't.
12:49 bshum I was hitting enter.
12:49 eeevil yeah, sounds like a difference in how the form is triggered, and the spinner being tied (probably) to the onClick of the button
12:55 bshum Yeah that makes sense now.
12:56 bshum Not sure I'm happy with it, but I'll add it to the long list of "things to figure out before we ever use autosuggest in production" :\
12:59 hbrennan joined #evergreen
13:02 smyers__ joined #evergreen
13:11 jeff Hrm. Normally when I look at transits where source and dest are equal, they are holds captured as transits by a SIP device.
13:11 jeff These normally have an atc.copy_status of 8 (on holds shelf)
13:13 jeff But I see one that has an atc.copy_status of 7 (reshelving), and I have no entries in logs matching the copy id or copy barcode on the day of the transit's source_send_time...
13:13 jeff no, scratch that, looked on the wrong day
13:13 jeff (well, right day, wrong month)
13:14 jeff I'm going to guess that it was captured for a hold, but the transit was modified when the hold was cancelled? Now that I have the correct day, I can probably find out.
13:15 bshum Maybe it's an aborted transit.
13:16 bshum I think aborting would put it back to 7.
13:17 jeff wouldn't aborting the transit set a dest_recv_time?
13:17 jeff actually ending the transit?
13:17 jeff closing, ending, completing... your pick. :-)
13:18 bshum Hmm, that does sound like it should ;)
13:18 bshum hold_transit_copy or transit_copy?
13:18 bshum (doesn't matter really, just curious to peek at our DB)
13:20 jeff patron cancelled hold via opac while item was in transit.
13:20 jeff action.transit_copy
13:21 tsbere For fun: Aborted transits are, to my knowledge, *deleted*
13:21 bshum That's what I thought too
13:21 jeff i was looking at self-transits "in flight" using: select * from action.transit_copy where source = dest and dest_recv_time is null order by source_send_time asc;
13:21 jeff tsbere: thanks. i think i remember that also.
13:22 bshum jeff: Where do you see status?  The field in the table?
13:22 bshum Maybe that's just storing the original status before it went into transit.
13:22 jeff bshum: the status of the copy should be in-transit, but i was looking at action.transit_copy.copy_status for the preserved status.
13:23 bshum To set it back when it arrives
13:23 bshum Oh okay, so I do see it right
13:23 jeff And I think it's not so much that the copy ever had that status, but that it should be put in that status when the transit completes.
13:24 jeff because to the best of my knowledge and (weakly) confirmed by a (brief) look at the audit tables, the copy doesn't actually get put into the status shown in action.transit_copy.copy_status before asset.copy.status is set to 6 (in transit)
13:25 jeff that behavior might be different for different transits. dunno.
13:26 tsbere It should be "what status the copy should be put into post-receive"
13:26 tsbere That could be the previous status (copy randomly ended up somewhere else and is going back home) or a new one (copy was returned to a different library or was put into transit to fill a hold)
13:26 jeff in theory the item would have been scanned when it reached the proper area of the library, and would then be placed into a reshelving status because its hold was cancelled, but in this case the item never made it to the proper place in the library (I suspect it was placed on a cart, then back on the shelves)
13:27 Dyrcona jeff: You mean staff have missed a checkin scan? We see that from time to time.
13:27 jeff doing pretty good. only six self-transits still in transit that started before today.
13:29 jeff Dyrcona: in this case, it's more likely that staff for whatever reason overrode the sorter and said "that shouldn't have gone into the exception bin, it's a youth item!" and it never made it to being scanned to capture the hold.
13:30 Dyrcona jeff: Ok. That makes sense, too. One of our libraries has a sorter, but tsbere would be more familiar with their issues than I am.
13:30 jeff we see that from time to time where a stack of either very thin books or a multi-tag "kit" goes into the exception bin, and staff think it was in error (because sometimes it is), not realizing that in this case, it was because there was a hold.
13:30 tsbere We have 7 with no send time (migrated transits, I assume <_<), 6 from 2011 or earlier, and 5 from last month. <_< (on the self transits front)
13:31 jeff I'm having this be a report that's sent via email (when there's any output, skipped when zero output) so that staff can go find the items and capture them so that the hold goes available, etc.
13:31 afterl joined #evergreen
13:31 jeff That way we'll have one or two every couple of weeks, as opposed to reporting and seeing that there are six dating back to May.
13:32 jeff (six total, at least one of which dates back to may)
13:32 jeff not a big problem, but one that should be dealt with more quickly when it comes up.
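A hedged sketch of the kind of recurring report jeff describes, built on his self-transit query above; the join to config.copy_status just labels the post-receive status stored on the transit, and the date cutoff is illustrative:

    -- self-transits still open that started before today
    SELECT atc.id, atc.source, atc.source_send_time,
           ccs.name AS post_receive_status
      FROM action.transit_copy atc
      JOIN config.copy_status ccs ON ccs.id = atc.copy_status
     WHERE atc.source = atc.dest
       AND atc.dest_recv_time IS NULL
       AND atc.source_send_time < CURRENT_DATE
     ORDER BY atc.source_send_time;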
13:58 Dyrcona configure: error: ***OpenSRF requires ncurses development headers
14:00 Dyrcona Oh. I ran the prerequisites for the wrong target: ubuntu-lucid instead of ubuntu-precise.
14:01 Dyrcona Think I'll just blow the VM away and start over.
14:02 stevenyvr joined #evergreen
14:05 bshum We should add a branch to nuke that target out.
14:13 jeff For those of you that work with external authority files -- from what sources do you obtain them?
14:14 csharp jeff: BSLW in our case, and Dyrcona's
14:14 Dyrcona yes, what csharp said. :)
14:14 csharp PINES used to work with Marcive before that
14:15 bshum Someday, when the dust settles, I think we have an ongoing project with BSLW to get going with authorities too.
14:15 jeff we have used marcive before. scope of that was mostly a one-time cleanup project.
14:16 bshum What is this openils_dojo.js file that keeps erroring out in my console?  I can't find it anywhere, is it custom stuff that doesn't exist anymore?
14:18 jeff iirc, the dojo "bundle" which can be created. it's used to pre-load most of the things that are dojo.require'd
14:19 bshum Oh, I found it now
14:19 jeff see Open-ILS/examples/openils.profile.js
14:19 bshum Went back into my old emails.  I see it.
14:19 bshum https://bugs.launchpad.net/evergreen/+bug/1076582 disappeared into the weeds of "opinion"
14:19 pinesol_green Launchpad bug 1076582 in Evergreen "Orphan references to openils_dojo.js should be removed" (affected: 1, heat: 8) [Low,Opinion]
14:19 bshum Apparently because I put it there .... oy.
14:24 bshum I'm changing the bug and marking it as documentation.
14:24 bshum Just to make it more likely to get somewhere real for the time being till the solution is finalized.
14:27 bshum eeevil: dbwells: Opinions on backporting the fix for https://bugs.launchpad.net/evergreen/+bug/1164720 to 2.4 and 2.5?
14:27 pinesol_green Launchpad bug 1164720 in Evergreen "Do not allow a list to be created with no name" (affected: 2, heat: 10) [Wishlist,Fix committed]
14:31 Dyrcona I would generally say "no" to backporting wishlist items.
14:32 Dyrcona But that might not be a wishlist depending on your point of view.
14:33 bshum That's kind of what I'm wrestling with.
14:33 bshum I can't remember now why I marked it as wishlist.
14:34 dbwells bshum: It seems like a legit bugfix to me, I've got no problems with it being backported.
14:35 * eeevil looks
14:38 eeevil bshum: if you can pick it clean, I'm ok with it coming back to 2.4, but I wouldn't worry about it if there are any scary conflicts
14:40 bshum Alrighty, I'll go give those a whirl.  Thanks guys!
14:42 bshum It's in both branches now.
14:48 jl- trying to convert marc -> evergreen bre json,
14:48 jl- currently the format is   <record xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
14:48 jl- xsi:schemaLocation="http://www.loc.gov/MARC21/slim
14:48 jl- http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd">
14:49 csharp okay - I'm trying to manually run the action.find_hold_matrix_matchpoint function to see which hold policies are chosen given a set of parameters, and I'm getting the error "function find_hold_matrix_matchpoint(integer, integer, integer, integer, integer) does not exist"
14:50 csharp and when I try to cast a copy ID as bigint, I get the same error
14:50 csharp any tips on getting that to work?
14:53 csharp oh hmm - when I cast copy id as bigint I get "function find_hold_matrix_matchpoint(integer, integer, bigint, integer, integer)"
14:53 jeff can you show an example of how you're calling it when you attempt to cast to bigint for the third argument?
14:54 Dyrcona csharp: Are you psql or PgAdmin?
14:54 Dyrcona using
14:54 gmcharlt and are you qualify the function name with the schema?
14:54 jeff select action.find_hold_matrix_matchpoint(23,23,123::bigint,1,1); works for me.
14:54 eeevil bshum++
14:54 gmcharlt i.e., action.find_... rather than find_...
14:55 csharp gmcharlt: that was it - duh
14:55 csharp I figured it out at the same time you responded - thanks
14:55 eeevil jl-: your marcxml is lacking the standard (likely required by the tools) xml namespace
14:55 csharp jeff++ gmcharlt++
14:56 csharp didn't end up needing the cast either, btw
14:56 eeevil jl-: unless you have a <collection> wrapper that supplies the default ns, of course
15:05 jl- eeevil: let me give u a sample record
15:05 jl- eeevil: http://paste.debian.net/hidden/424dfe0f/
15:05 jl- I'm wondering what converting needs to be done pre-import
15:09 Dyrcona jl-: That is the first record from a collection?
15:12 jl- Dyrcona: yes
15:12 jl- do you need more?
15:12 Dyrcona jl-: No, it explains the missing </collection>.
15:12 jl- right
15:12 Dyrcona jl-: It looks like your records are missing the namespace declaration, and we don't normally use the xsi schema stuff, but it should be ignored.
15:13 jl- I figured testing with 1 record would be easier
15:13 jl- than with over a million
15:13 Dyrcona If that is exactly what you are using, either remove the <collection> at the top or add </collection> at the bottom.
15:14 Dyrcona What errors are you getting?
15:15 jl- Dyrcona: well do I need to do any encoding before importing?
15:15 jl- like UFT-8
15:15 jl- *UFT
15:15 jl- *UTF
15:16 Dyrcona If the records aren't UTF-8 to begin with, then yes.
15:16 Dyrcona That one claims to be UTF-8.
15:17 Dyrcona You see the a in the leader before the 2200229? That says, I'm UTF-8.
15:18 jl- interesting
15:20 Dyrcona If it is blank there, it's supposed to be MARC8, but I've seen records in ISO-8859-1.
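A hedged way to check the leader byte Dyrcona is pointing at (position 09, counting from zero) once records are staged as MARCXML; the table name is an assumption from the migration docs, and xpath() is stock PostgreSQL:

    SELECT id,
           substring((xpath('//*[local-name()=''leader'']/text()', marc::xml))[1]::text
                     FROM 10 FOR 1) AS char_coding  -- 'a' = UTF-8, blank = MARC-8
      FROM staging_records_import
     WHERE xml_is_well_formed(marc);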
15:29 bshum Syndetics and UPC.  But on 2.4.  Too soon Executus... too soon
15:30 jl- do I run these commands as postgres?
15:31 smyers_ joined #evergreen
15:36 Dyrcona jl-: What commands? You using the scripts from Evergreen/Open-ILS/src/extras/import?
15:48 jbfink joined #evergreen
15:50 rfrasur joined #evergreen
15:51 jl- Dyrcona: ok I have the file saved as 1record.bib , I'm connected to the db as evergreen
15:52 jl- in the example in the document I see the records as gutenberg.marc
15:52 jl- is that a problem?
15:52 Dyrcona What do you expect to do with it directly in the database?
15:52 jl- good question
15:53 Dyrcona jl-: Normally, you would run your records through the scripts in the directory that I pointed out above.
15:53 Dyrcona What instructions are you following, can you paste the URL?
15:55 jl- sorry i was thrown off by something the user prior to me did
15:55 jl- following http://wiki.evergreen-ils.org/doku.php?id=evergreen-admin:importing:bibrecords
15:57 jl- I don't have a .bre file tho
15:58 jl- and my .marc file is .bib
16:00 Dyrcona jl-: That shouldn't matter. If you follow the steps under the example and change the file names, etc. to suit your situation. It should work.
16:01 Dyrcona You also want to skip the step about setting the bib_source to 3. That is specifically for project gutenberg records.
16:02 jeff > Congratulations, your application for access to OverDrive's new API's has been approved.
16:03 jeff they thanked me for my patience.
16:03 jeff i look forward to giving it a spin.
16:04 Dyrcona jeff: Does everyone who wants to use the API have to go through this, or just devs who want to develop for the API?
16:05 jeff anyone who needs an api key. the long delay in our case was supposedly because we use ezproxy to auth patrons to overdrive.
16:05 jeff i am hoping when i dig into it that it will not turn out to have been done in an unsuitable fashion.
16:05 Dyrcona So, in other words, both and everyone.
16:05 jeff because as it stood, i couldn't figure out why it required anything special.
16:19 jl- Dyrcona: !!! TCN  is already in use, using TCN (s228) derived from System ID.
16:20 csharp bshum: http://pines.georgialibraries.org/evergreen2011/ back up
16:21 Dyrcona jl-: Did you do the step about getting your max(id) from biblio.record_entry and passing it +1 to --startid on marc2bre.pl?
16:21 Dyrcona The gutenberg instructions assume an empty database, I think.
16:21 jl- no I didn't do that
16:22 Dyrcona It has been three or four years since I last used these tools. I usually do something bespoke for every set of records or record sources that I have to deal with.
16:25 jl- Dyrcona: sorry, what needs to be done? evergreen=# SELECT MAX(id)+1 FROM biblio.record_entry;
16:25 jl- ?column?
16:25 jl- ----------
16:25 jl- 228
16:25 jl- (1 row)
16:26 Dyrcona start over with marc2bre.pl, and when you run it, add --startid=228 to your options.
16:26 Dyrcona If the 228 records don't matter and you're using the system for learning, you could just reload the database, too.
16:27 jl- yes that would be fine
16:27 Dyrcona Of course the records might not be there and the sequence for the id just got incremented with failed inserts.
16:27 jl- somebody tried inserting records before
16:28 jl- now I'm trying my luck
16:29 jl- sudo perl marc2bre.pl --startid=228 --db_user evergreen --db_pw censored -db_name evergreen /home/postgres/1record.bib > ~/1record.bre
16:29 jl- !!! TCN  is already in use, using TCN (s228) derived from System ID.
16:29 jl- well, seems like it did import, but added an s?
16:30 jl- indeed there's a 1record.bre
16:32 Dyrcona jl-: There's usually a record with id -1 for when a book is checked out that isn't already in the system, like how a lot of libraries don't put paperbacks in the catalog.
16:32 dbs jl-: I can't remember the last time I've used marc2bre; I would go with something like http://docs.evergreen-ils.org/2.5/_migrating_your_bibliographic_records.html
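A condensed, hedged sketch of the staging-table approach in that link (not a substitute for the full doc steps; the table layout is simplified, the file path is illustrative, and the file needs to be UTF-8 with one MARCXML record per line):

    CREATE TABLE staging_records_import (id BIGSERIAL PRIMARY KEY, marc TEXT);
    COPY staging_records_import (marc) FROM '/path/to/records.xml';
    -- find anything that will trip the well-formedness check
    SELECT id FROM staging_records_import WHERE xml_is_well_formed(marc) IS NOT TRUE;
    -- then load the clean rows
    INSERT INTO biblio.record_entry (marc, last_xact_id)
        SELECT marc, 'IMPORT' FROM staging_records_import
        WHERE xml_is_well_formed(marc);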
16:44 hbrennan Any Equinox people here to tell me whether you'll be staffed on Feb 17? Washington's Birthday?
16:46 hbrennan Enough to aid in an upgrade?
16:48 jl- Dyrcona: step 3 runs fine, however there is no .sql file for 4.
16:49 Dyrcona jl-: Like dbs pointed out above, those scripts are kind of old and the link he provided is what is recommended now.
16:49 jl- yes but I'm already pretty far in
16:50 phasefx hbrennan: we're open that day (everday for an emergency)
16:50 jl- 2 steps from success hopefully
16:50 hbrennan phasefx: Thanks. Just thinking of possible opportunities. A holiday for us, but might be nice to upgrade while it's quiet. Thanks again!
16:51 jeffdavis jl-: if you're following the (old) steps on that wiki page, the output of pg_loader.pl should be one or more sql files
16:52 phasefx hbrennan: you're welcome. :)
16:52 jl- jeffdavis: look at the code for 3.
16:52 jl- there is no .sql file specified as output
16:53 jl- also, for 2 I get this
16:53 jl- postgres@evergreendev:/openils/Evergreen-ILS-2.5.1/Open-ILS/src/extras/import$ perl direct_ingest.pl 1record.bre > ~/1record.ingest
16:53 jl- We have no more use for authority or biblio ingest ... just insert the are or bre objects and you're done!
16:53 jl- not sure if that's a default message
16:54 eeevil jl-: right. just use pg_loader.pl, or something like the docs that Dyrcona pointed you at
16:55 jl- pg_loader.pl -or bre -or mrd -or mfr -or mtfe -or mafe -or msfe -or mkfe -or msefe -a mrd -a mfr -a mtfe -a mafe -a msfe -a mkfe -a msefe --output=1record < ~/1record.ingest
16:55 jl- Writing file ...
16:55 jl- Can't use an undefined value as a symbol reference at pg_loader.pl line 83.
16:55 eeevil just "-or bre"
16:56 Dyrcona jl-: Did you get a file named 1record?
16:56 Dyrcona That's probably your sql file without the extension. It won't add it.
16:57 jl- postgres@evergreendev:/openils/Evergreen-ILS-2.5.1/Open-ILS/src/extras/import$ perl pg_loader.pl -or bre --output=1record < ~/1record.ingest
16:57 jl- Writing file ...
16:57 jl- Can't use an undefined value as a symbol reference at pg_loader.pl line 83.
16:57 jl- sec
16:57 jl- the < should be >
16:58 jl- now it's working
16:58 jl- do I need the -a parameters?
16:58 jl- I took them out
16:59 Dyrcona jl-: Those are different types of database entries that will get created. You can usually create them later by running some indexer scripts.
16:59 eeevil you don't need those
17:00 jl- taking a long time for just a single record
17:00 jl- :)
17:00 * dbs advocates deleting the page jl- is looking at in the wiki, but historical significance yada yada
17:02 jl- it's still going
17:06 jl- hmm
17:08 jl- it says This step will take a while to order the output properly (all those -or options) to avoid missing foreign keys before it actually dumps any content into gutenberg.sql - be patient :)
17:08 jl- but since I have only 1 record.. this is bizarre
17:16 gdunbar joined #evergreen
17:17 mmorgan left #evergreen
17:22 jl- is my record MARC21 or MARCXML?
17:22 jl- <collection>
17:22 jl- <record xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
17:22 jl- xsi:schemaLocation="http://www.loc.gov/MARC21/slim
17:22 jl- http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd">
17:28 Dyrcona jl-: That is MARCXML. MARC21 is binary.
17:35 dcook joined #evergreen
17:35 jl- then I could just go to step 1 on http://docs.evergreen-ils.org/2.5/_migrating_your_bibliographic_records.html ?
17:38 Dyrcona You can try it. I've never used that method.
17:41 Dyrcona I'm going to reboot my laptop, 'cause that seems to be the only thing that fixes these bugged Red Bull ads on Youtube.
17:43 jl- dbs: I tried your method with your link
17:43 jl- I'm getting this error:   evergreen=# COPY staging_records_import (marc) FROM '/home/postgres/records.bib';
17:43 jl- ERROR:  invalid byte sequence for encoding "UTF8": 0x92
17:43 jl- CONTEXT:  COPY staging_records_import, line 16654689
18:18 Dyrcona joined #evergreen
18:18 Dyrcona @lovehate Google
18:18 pinesol_green Dyrcona: Yeah, well, you know, that's just, like, your opinion, man.
18:31 dcook hehe
18:32 Dyrcona If you have G+, you unlucky person, check out the latest post in my stream for why.
18:46 dcook O_o
18:55 eby joined #evergreen
19:00 fredp_ joined #evergreen
19:37 fredp__ joined #evergreen
19:53 jl- ok I converted to UTF8 but then I get re
19:53 jl- Dyrcona: still there? :)
20:01 Dyrcona jl-: You shouldn't have to convert to UTF-8 the example you showed says it is already UTF-8.
20:06 finnx joined #evergreen
20:06 jl- Dyrcona: I think I got it right now
20:07 jl- and yes you are correct
20:07 Dyrcona jl-: Ok. That's good.
20:10 jl- Dyrcona: I'm finished with step 5 as seen here http://wiki.evergreen-ils.org/doku.php?id=evergreen-admin:importing:bibrecords
20:10 jl- now you said about 6. I shouldn't do that
20:10 jl- or was that the step right below 6.
20:12 Dyrcona Step 6, the update of biblio.record_entry.
20:12 Dyrcona There is a source in the database for gutenberg records. It has ID 3.
20:12 jl- I actually didn't use gutenberg records
20:13 Dyrcona You only want to do that step if all of the records in your database come from project gutenberg.
20:13 jl- I used one of our own
20:13 Dyrcona ok.
20:13 jl- the record that I pasted is from a previous system
20:13 jl- so how can I make it visible in the online catalog?
20:14 Dyrcona You usually have to add copies.
20:14 jl- can I do it without copies?
20:14 jl- this is for demonstration purposes
20:15 Dyrcona Yeah, if you make the source transcendent or if you have URIs in the 856 that can be targeted at a location.
20:15 Dyrcona That would be the 856 in the MARC record.
20:15 Dyrcona Are these regular books and DVDs and things like that?
20:15 jl- Dyrcona:  yes
20:16 jl- they are
20:16 Dyrcona If you don't want to add copies, then you make the source transcendent.
20:16 jl- how do I do that?
20:17 Dyrcona I would do it in the database.
20:17 jl- evergreen=# select * from config.bib_source;
20:17 jl- id | quality |      source       | transcendant | can_have_copies
20:17 jl- ----+---------+-------------------​+--------------+-----------------
20:17 jl- 1 |      90 | oclc              | f            | t
20:17 jl- 2 |      10 | System Local      | f            | t
20:17 jl- 3 |       1 | Project Gutenberg | t            | t
20:17 Dyrcona update config.bib_source set transcendant = 't' where id = X.
20:17 jl- hmm
20:17 Dyrcona Yeah, just replace X with the id for the source you used.
20:18 Dyrcona Or you could make them all source 3. :)
20:18 jl- i don't even know what source I used it was only 1 record
20:18 jl- I don't remember entering any of those sources
20:19 Dyrcona jl-: They come with the default installation.
20:20 Dyrcona select id, source from biblio.record_entry;
20:20 Dyrcona try that.
20:21 jl- id  | source
20:21 jl- -----+--------
20:21 jl-  -1 |      1
20:21 jl- followed by 120 empty rows
20:23 jl- Dyrcona
20:24 Dyrcona You didn't actually get a record in if you don't get any output other than -1, 1.
20:24 jl- hmm
20:24 Dyrcona -1 comes with the Evergreen installation and has some internal uses for pre cataloged circulations.
20:26 Dyrcona Have you tried the other method that dbs pointed out earlier?
20:27 Dyrcona Those instructions are more recent and more frequently used.
20:27 jl- yes but it complained about UTF8
20:27 jl- well
20:27 jl- and then afterwards about malformed xml
20:28 jl- evergreen=# select staging_importer();
20:28 jl- ERROR:  Attempted to INSERT MARCXML that is not well formed
20:28 jl- CONTEXT:  SQL statement "INSERT INTO biblio.record_entry (marc, last_xact_id) VALUES (stage.marc, 'IMPORT')"
20:28 jl- PL/pgSQL function "staging_importer" line 5 at SQL statement
20:30 jl- one more question for you
20:30 Dyrcona Does your file still have the <collection> at the top and no </collection> at the bottom?
20:30 jl- earlier I used yup
20:31 jl- http://paste.debian.net/80588/
20:31 jl- here it is
20:31 jl- well it does have </collection>
20:32 jl- does the file look alright to you?
20:32 Dyrcona You should try xmllint on it.
20:33 Dyrcona You also might want to remove the <collection> tags.
20:34 Dyrcona The software may also be looking for the namespace directive, which should be in the <collection> tag.
20:35 jl- earlier I did this: sudo perl marc2bre.pl --startid=228 --db_user evergreen --db_pw censored -db_name evergreen /home/postgres/ship.bib > ~/1record.bre -- where did the 228 come from?
20:35 jl- there were 228 rows in biblio.record_entry
20:38 Dyrcona jl-: Not necessarily. When you try to insert into a table with a sequence, the sequence updates even if no rows are actually inserted.
20:39 Dyrcona select count(*) from biblio.record_entry;
20:39 Dyrcona That will tell you how many rows there are in the table.
20:40 jl- where did we get the 228 from?
20:40 jl- evergreen=# select count(*) from biblio.record_entry;
20:40 jl- count
20:40 jl- -------
20:40 jl- 228
20:40 jl- (1 row)
20:41 Dyrcona Then you do have 228 rows and should have gotten more output from the previous query.
20:43 jl- why did we do --startid=228?
20:43 Dyrcona So, try the select id, source query again.
20:44 Dyrcona Because the old method will start at 1, and if you already have bib records in the database, you'll get duplicate key row errors.
20:44 jl- gotcha
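A hedged way to see the gap Dyrcona describes between the id sequence and the rows that actually made it in; the sequence name below is the stock one for a BIGSERIAL column, and pg_get_serial_sequence() will confirm it if the schema differs:

    SELECT pg_get_serial_sequence('biblio.record_entry', 'id');  -- confirm the sequence name
    SELECT MAX(id) FROM biblio.record_entry;                     -- highest id actually stored
    SELECT last_value FROM biblio.record_entry_id_seq;           -- how far failed inserts pushed the sequence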
20:45 Dyrcona Have you loaded records before or loaded the sample data?
20:46 jl- no this is my first time loading records, someone else tried loading records earlier today but failed
20:47 jl- the select id, source returns the same result
20:47 jl- -1 | 1
20:47 jl- and empty rows
20:50 Dyrcona Something is very wrong, because you should not have rows in biblio.record_entry without ids. The id column does not allow null values.
20:50 Dyrcona I suggest you blow the database out and start over.
20:50 jl- yeah thats an idea
20:51 jl- when I do: root@evergreendev:/openils/Evergreen-ILS-2.5.1/Open-ILS/src/extras/import# sudo perl marc2bre.pl --startid=999 --db_user evergreen --db_pw 3v3rGr33n -db_name evergreen /home/postgres/1record.marc > ~/1recordb.bre
20:51 jl- !!! TCN  is already in use, using TCN (s999) derived from System ID.
20:51 jl- is that concerning?
20:53 Dyrcona jl-: I don't know what's happened 'cause I can't see your database directly, but what you are describing sounds like the database is fubar.
20:53 dac joined #evergreen
20:56 Dyrcona I suggest dropping the database and recreating it.
21:10 jl- Dyrcona: let's try the other method that dbs mentioned for a sec
21:11 Dyrcona Go ahead, but I wouldn't do anything until the database is reloaded.
21:11 jl- if you have time
21:11 Dyrcona I'm going to bed, soon, so I can't really stick around.
21:12 jl- no prob, you've already helped me so much
21:12 jl- thanks a lot
21:12 jl- I
21:12 jl- ll be back tmrw
21:14 Dyrcona yw
21:16 jl- btw, you say the records are utf8 but I do get that message when I try dbs' method
21:18 jl- ERROR:  syntax error at or near "AS"
21:18 jl- LINE 1: ...TE OR REPLACE FUNCTION staging_importer() RETURNS NULL AS $$
21:20 jl- changed it to VOID
21:21 Dyrcona jl-: That message doesn't have anything to do with the format of your records.
21:22 Dyrcona I think you'll get more help when it is daylight hours on the US East Coast.
21:25 jl- :)
21:28 jl- evergreen=# COPY staging_records_import (marc) FROM '/home/postgres/ship.bib';
21:28 jl- ERROR:  invalid byte sequence for encoding "UTF8": 0x92
21:28 jl- but as you said, they are utf8
21:31 Dyrcona they say they are UTF-8, doesn't mean that they are.
21:32 Dyrcona That's one of the joys of MARC.
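A hedged workaround for the 0x92 byte above: 0x92 is the Windows-1252 right single quotation mark ("smart quote"), so if the file is really Windows-1252 rather than UTF-8, COPY can convert it on the way in. The path is jl-'s; the encoding name is the assumption to verify:

    COPY staging_records_import (marc)
        FROM '/home/postgres/ship.bib' WITH (ENCODING 'WIN1252');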
21:38 Dyrcona Well, good night, #evergreen!
23:24 timhome_ joined #evergreen
23:28 zerick joined #evergreen
