12:57 |
Freddy_Enrique |
Dyrcona, in order to make my record appear on the OPAC, I must create an item first, right?
12:58 |
Dyrcona |
Freddy_Enrique: Usually, yes. |
12:58 |
Freddy_Enrique |
I made a clean installation, created the org units, created some users (with staff permissions), and then went directly to the records
12:58 |
Dyrcona |
Or a located URI, i.e. a URL in a 856 MARC tag with a subfield $9 and a couple of other conditions. |
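For reference, a located URI in MARCXML looks roughly like the fragment below: an 856 with first indicator 4, second indicator 0 or 1, a $u holding the URL, and a $9 carrying an org unit shortname. The shortname and URL here are made up for illustration:

```xml
<!-- Hedged sketch of a located URI; BR1 is a hypothetical org unit shortname -->
<datafield tag="856" ind1="4" ind2="0">
  <subfield code="u">http://example.org/ebook/12345</subfield>
  <subfield code="y">Access online</subfield>
  <subfield code="9">BR1</subfield>
</datafield>
```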
12:58 |
Freddy_Enrique |
I could see them in the XUL staff client
12:59 |
Freddy_Enrique |
but I could not see them on the OPAC
12:59 |
Freddy_Enrique |
! |
13:09 |
Freddy_Enrique |
MARCEdit software can help me with that
13:09 |
Dyrcona |
How are you importing your records? |
13:10 |
Freddy_Enrique |
Well, I had some experience with migration, but the final format of my Excel records ended up as *.mrc.
13:10 |
Dyrcona |
Assuming that you're coming from an old system, it would be good if you can get it to put the information you need into some non-standard MARC tag during the export. |
13:11 |
Freddy_Enrique |
here, if I'm not mistaken, it's MARCXML
13:12 |
Freddy_Enrique |
uhm.... what if the records are contained in CSV? Would that be much easier?
13:12 |
Dyrcona |
Well, I don't know. That depends on what you're starting with. |
13:13 |
Dyrcona |
Here's one way to do it: http://docs.evergreen-ils.org/2.12/_migrating_your_bibliographic_records.html |
13:13 |
Freddy_Enrique |
I have my records in many formats; just to clarify, they don't go beyond 1,000
13:14 |
Dyrcona |
When I said "records" I meant the MARC records, and not other records. |
13:14 |
Dyrcona |
I've done a migration or 4, and it depends a lot on what you're starting with and what access you have to the original system. |
13:16 |
Freddy_Enrique |
Here there are maaany libraries that... have been working with just Excel
13:17 |
Dyrcona |
So, they've been using Excel to track their patrons, items, and transactions? |
13:19 |
Freddy_Enrique |
MARC records then... I have also exported MARC records with the *.mrk extension
13:20 |
Freddy_Enrique |
mrc and csv |
13:20 |
Freddy_Enrique |
Thanks Dyrcona, I'll really need it |
13:20 |
Dyrcona |
mrk is usually a special format called "MARC Breaker" format. |
13:21 |
Dyrcona |
It is used by MARCEdit, MARC Breaker, and some other programs. |
13:21 |
Freddy_Enrique |
With this format I can edit the MARC records. The .mrc is just for importing
13:21 |
Dyrcona |
Evergreen wants either standard MARC 21 (usually .mrc) or MARCXML (usually .xml). |
13:22 |
Dyrcona |
Yes, that's right. |
13:22 |
Freddy_Enrique |
Then...If I have the mrc file, I have most of the work done? |
13:23 |
Dyrcona |
Well, if you can get the item information in the mrc file, it would be easier to import item and bibliographic information at once. |
13:23 |
Dyrcona |
You can do that in the staff client import feature, called Vandelay. |
13:24 |
Dyrcona |
How are you getting the MARC from the CSV? Are you looking the records up somewhere via ISBN? |
13:25 |
Freddy_Enrique |
Nope, I use MARCEdit to convert my CSV files to .mrk files
13:26 |
Freddy_Enrique |
when I'm done polishing the MARC records, I finally convert them to .mrc
13:27 |
Dyrcona |
I don't use MARCEdit, so I didn't know it could convert csv to MARC, but I guess it wouldn't be too hard to make a rudimentary record. |
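A rudimentary record per CSV row is indeed only a few lines of Perl. The untested sketch below assumes a hypothetical column layout (title, author, isbn, barcode) plus the CPAN modules Text::CSV and MARC::Record; the 949 holdings tag is likewise just an illustration, not a standard:

```perl
#!/usr/bin/perl
# Untested sketch: build one rudimentary MARC record per CSV row.
use strict;
use warnings;
use Text::CSV;
use MARC::Record;

my $csv = Text::CSV->new({ binary => 1 });
open my $fh,  '<:encoding(utf8)', 'records.csv' or die $!;  # placeholder name
open my $out, '>:raw', 'records.mrc' or die $!;

$csv->getline($fh);    # assume the first row is a header; skip it
while (my $row = $csv->getline($fh)) {
    my ($title, $author, $isbn, $barcode) = @$row;   # hypothetical columns
    my $rec = MARC::Record->new();
    $rec->leader('00000nam a2200000 a 4500');
    $rec->append_fields(
        MARC::Field->new('020', ' ', ' ', a => $isbn),
        MARC::Field->new('100', '1', ' ', a => $author),
        MARC::Field->new('245', '1', '0', a => $title),
        # Holdings tag/subfields are whatever your import profile expects:
        MARC::Field->new('949', ' ', ' ', i => $barcode),
    );
    print $out $rec->as_usmarc();
}
```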
13:29 |
|
kmlussier joined #evergreen |
13:29 |
Freddy_Enrique |
It is possible; there I can also add a field for the copies/items. In another system, the field is 952
13:29 |
Freddy_Enrique |
Is it the same for Evergreen? |
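For what it's worth, Evergreen doesn't hard-code 952: Vandelay's import item attribute definitions let you map whatever tag and subfields the source file uses. In MARC Breaker (.mrk) form, an embedded holdings field might look like the sketch below; the 949 tag and subfield meanings here are illustrative assumptions, not Evergreen defaults:

```
=LDR  00000nam a2200000 a 4500
=020  \\$a9780000000002
=245  10$aExample title for import
=949  \\$aBR1$lStacks$i30000000000017$tbook
```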
09:40 |
miker |
bonus points for cut+parallel |
09:40 |
miker |
er, split, not cut |
09:40 |
bos20k |
miker: I think that code is in 0964.data.electronic-resources.sql |
09:41 |
Dyrcona |
bos20k: Going to 2.12, you'll probably want to skip that, though, and just update the records in a transaction with reingest on same marc set to true.
09:41 |
miker |
bos20k: so it is! |
09:42 |
Dyrcona |
There's more than just a metabib reingest required, if you want the new goodies, like 901$s. |
09:42 |
bos20k |
miker: is that faster than the SELECT in 0967.data.more_fixed_fields.sql? |
13:11 |
Dyrcona |
Nope. |
13:12 |
bos20k |
er, metabib.reingest_metabib_full_rec() that is |
13:12 |
jeff |
in the above, my "full" was referencing the style of ingest where you set appropriate flags and execute a query or queries that cause the triggers on biblio.record_entry to fire for all records. |
13:12 |
Dyrcona |
update biblio.record_entry set marc = marc where.... |
13:12 |
Dyrcona |
Or similar. |
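Spelled out, that transaction looks something like the untested sketch below; 'ingest.reingest.force_on_same_marc' is the config.internal_flag that makes a no-op MARC update fire the ingest triggers, and the WHERE clause should be narrowed to the records you actually mean:

```sql
-- Untested sketch: force reingest of live records in one transaction.
BEGIN;
UPDATE config.internal_flag SET enabled = TRUE
 WHERE name = 'ingest.reingest.force_on_same_marc';

UPDATE biblio.record_entry
   SET marc = marc              -- no-op update; fires the ingest triggers
 WHERE NOT deleted AND id > 0;  -- narrow this as needed

UPDATE config.internal_flag SET enabled = FALSE
 WHERE name = 'ingest.reingest.force_on_same_marc';
COMMIT;
```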
13:12 |
bos20k |
Oh, so activating all the triggers there... |
13:13 |
bos20k |
Hmmm, so like the 901$s update, but not only for those with a source set.
08:52 |
jeff |
Guest995: how do you generally deliver MARC records to customer libraries? does the library fetch them from you via FTP? SFTP? logging into a web interface? email? |
08:52 |
graced |
Guest995: well that smells like something we could probably point you in the right direction for... but if you need more help I'd come back into channel after say 10:30am when the bulk of the developers are around |
08:53 |
Dyrcona |
jeff | Guest995: There is the new e-book API that jeffdavis has been working on. That sounds more like what you're after. |
08:53 |
Dyrcona |
It requires the ingest of MARC records, though. |
08:54 |
Guest995 |
Oh, good to know. We have a small interface where customers can come to download their records for their specific content sets. We provide them mrc files and xml files |
08:54 |
Dyrcona |
It's designed to reach out to the vendors' APIs and get the circulation status of electronic items, and eventually allow the patron to circulate, etc., from the PAC.
08:54 |
jeff |
Dyrcona: Since the biblioboard content doesn't involve circs, and the ebook APIs for OverDrive / OneclickDigital both depend on MARC records being loaded via normal means, I'm not sure the ebook APIs help here. |
11:35 |
Dyrcona |
pinesol_green: Damned, Yankee! |
11:35 |
pinesol_green |
Factoid 'Damned, Yankee!' not found |
11:35 |
pinesol_green |
Dyrcona: Mr. Spock: Something fascinating just happened. |
11:37 |
Dyrcona |
marc? |
11:37 |
Dyrcona |
No. I guess that's not how it works. |
11:38 |
Dyrcona |
Already, someone is reporting cataloging bugs to me in the webstaff client. |
11:39 |
Dyrcona |
I only finished the installation about 45 minutes ago. |
15:18 |
Dyrcona |
I've been given a screen shot with FILE_UPLOAD_ERROR. |
15:19 |
Dyrcona |
That occurs one time in the code, in a function called upload_files, which actually appears to run after the file is uploaded (by Vandelay?).
15:21 |
kmlussier |
Yeah. I've been able to find a more useful error in the logs when I've seen that. |
15:22 |
Dyrcona |
Well, it is supposed to log "unable to read MARC file $filename", but I can't find that anywhere.
15:23 |
* Dyrcona |
thinks his logging is still broken, despite JBoyer's help. |
15:23 |
Dyrcona |
I was looking for other messages last week and couldn't find them, either. |
15:24 |
Dyrcona |
So, I'm asking for hints of what else to look for. |
15:46 |
Dyrcona |
What I don't know is the whole key to look up in memcached. |
15:47 |
Dyrcona |
And the servers both have about 2GB cached with over 200,000 items each. |
15:47 |
JBoyer |
Hmm. Things have to be done on both the sending and receiving machines, if that helps. But yeah, if your primary issue is related to Acq or EDI I'm not much help there. |
15:48 |
Dyrcona |
It's acq creating a po/picklist from a MARC file upload. I'm told that a couple of libraries report often having to do it more than once lately to get it to work.
15:48 |
|
wsmoak joined #evergreen |
15:50 |
JBoyer |
I could see how that could potentially point to memcache. Do you have all of the same servers listed in the same order everywhere you specify memcache connections? Depending on how and when things load them and then process them, this could happen: |
15:51 |
Dyrcona |
They should be. I use the same config files everywhere, AFAIK. Unless someone has been messing with them behind my back or some change didn't make it everywhere in the recent server migration. |
11:28 |
berick |
not running the targeter during the normal pull list time seems like a really good idea. |
11:29 |
miker |
it makes pulling holds a little more predictable |
11:32 |
Bmagic |
dbs Dyrcona jeff - this is specifically the bash command? Or exporting via vandelay? |
11:33 |
Dyrcona |
Bmagic: I have no idea about Vandelay, but the problem seems to involve MARC::Record somehow. |
11:33 |
Dyrcona |
Well, both problems. |
11:34 |
Bmagic |
alright |
11:36 |
Bmagic |
I have a handful of custom perl scripts that extract the marc on a regular basis on 16.04. Let me take a look at the one from March |
12:47 |
dbs |
http://search.cpan.org/~gmcharlt/MARC-XML-1.0.3/lib/MARC/File/XML.pm#new_from_xml([$encoding,_$format]) suggests its okay, I think |
12:51 |
jeff |
dbs: from my read of the MARC::File::XML function's documentation, it's purely a "give me a record with this encoding" thing. While I have encountered "MARC-8 encoded MARCXML" at least once in the wild, it isn't something that I think anyone wants to encourage as being a Thing. :-)
13:14 |
Dyrcona |
I'd like to point out that this started being a problem at Perl 5.20 or so. |
13:14 |
Dyrcona |
I think MARC::Record and/or MARC::Charset need fixes, not marc_export. |
13:15 |
Dyrcona |
Encode.pm has likely changed on us, again. |
13:17 |
dbs |
Dyrcona: I'm on ubuntu 14.04 with perl 5.18 fwiw |
13:20 |
Dyrcona |
I never noticed it on 14.04, but doesn't mean it didn't happen and I was unaware. |
14:10 |
jeff |
i haven't tested to see how yaz tools handle it |
14:11 |
jeff |
oh, nevermind -- outstanding pull request from tsbere, actually: https://github.com/perl4lib/marc-perl/pull/4 |
14:12 |
jeff |
though there's something else similar that i saw elsewhere... hrm. |
14:14 |
Dyrcona |
Writing your MARC record splitter in Perl is remarkably simple. |
14:14 |
Dyrcona |
I keep words... :) |
14:14 |
jeff |
and this: https://rt.cpan.org/Public/Bug/Display.html?id=70169 |
14:14 |
Dyrcona |
Anyway, since I'm messing with marc_export stuff lately, I'd like to make some improvements. |
12:34 |
dbwells |
berick: The only thing I wonder off the top of my head is whether restoring overdues would work properly if you go that route. If it doesn't, that is a bug in itself, but this route would give it a much wider path to surface. |
12:35 |
berick |
thanks dbwells, I'll give it a shot |
12:38 |
Dyrcona |
Has anyone successfully used marc_export on Debian 8 Jessie or Ubuntu 16.04? |
12:39 |
Dyrcona |
I'm seeing an issue with looping over bre output and putting US MARC into a file on both of those distros. |
12:39 |
Dyrcona |
It's just a loop of fretchrow_harshref, make a MARC::Record object, and write it to a file as US MARC. |
12:40 |
Dyrcona |
The program eats all the RAM on the VM before any records are written. |
12:42 |
Dyrcona |
fretchrow_harshref....Right, Raggie? Right on, Scoob! |
12:42 |
berick |
dude, your harshref'ing my mellow |
13:25 |
dbwells |
berick: The case of switching voids to adjustments for lost item overdues was not functionally necessary. It was an attempt to take advantage of a richer set of tools to hopefully, eventually, actually have *more* clarity. A guy can dream, can't he? |
13:28 |
berick |
dbwells: also good to know. and I agree in principle. it's better. |
14:12 |
|
NawJo joined #evergreen |
14:13 |
Dyrcona |
Bmagic: Have you encountered any issues with MARC on Ubuntu 16.04? Am I right you're running Ubuntu 16.04? |
14:13 |
Dyrcona |
csharp: I guess the same question goes to you, too. |
14:14 |
Dyrcona |
jeff and I are poking and it's looking like issues with Encode.pm at the moment. |
14:15 |
Dyrcona |
jeff is looking at Debian 8, and sees similar things, but slightly less bad in some ways. :) |
14:21 |
Dyrcona |
That's one I see that jeff doesn't. |
14:22 |
Dyrcona |
If I dump XML and then convert to USMARC with yaz-marcdump, things are better.
14:22 |
Dyrcona |
Though some of the records fail to parse. |
14:22 |
Dyrcona |
So, looking like length calculations and Encode.pm in MARC::Record, maybe... |
14:25 |
kmlussier |
Is this related at all to bug 1584891 ? |
14:25 |
pinesol_green |
Launchpad bug 1584891 in Evergreen 2.11 "marc_export -i gives incorrect record length in the leader when call numbers include UTF8 characters" [Undecided,Fix committed] https://launchpad.net/bugs/1584891 |
14:26 |
* kmlussier |
has nothing else to offer, but saw similar words in the bug report. |
14:28 |
Dyrcona |
Well, it could be, but I've seen really nutty behavior in my Boopsie extract program. |
14:28 |
Dyrcona |
It just uses MARC::Record on the output of a database query.
14:30 |
Dyrcona |
Actually, no. It has to be something else. |
14:31 |
Dyrcona |
I'm using master fetched as of a couple of hours ago.
14:32 |
kmlussier |
OK, worth a shot. I'll go back to figuring out why I haven't received e-mail all day. |
15:40 |
Dyrcona |
It's here if anyone wants to poke it, which I doubt: https://github.com/Dyrcona/boopsie |
15:40 |
berick |
cure your woopsies with a dash of Boopsie |
15:41 |
csharp |
berick++ |
15:41 |
Dyrcona |
I still suspect that there are issues with Encode and/or MARC::Record and Perl 5.22 on Ubuntu 16.04. |
15:41 |
Dyrcona |
berick++ :) |
15:41 |
berick |
it's fun not to work |
15:42 |
Dyrcona |
I'll try it with concerto data later. We have some French and Italian records with accents. |
10:39 |
kmlussier |
Would anyone be willing to look at bug 1661747 for JBoyer today? |
10:39 |
pinesol_green |
Launchpad bug 1661747 in Evergreen "Add get_org_unit_ancestor_at_depth to action trigger reactor helpers" [Wishlist,New] https://launchpad.net/bugs/1661747 |
10:40 |
kmlussier |
Dyrcona: Did you ever get any sample ISBNs to test the Czech added content module?
10:41 |
Dyrcona |
kmlussier: Better than that. They sent me some MARC records that I admittedly have not looked at yet.
10:41 |
Dyrcona |
I've been doing C/W MARS stuff this morning. |
10:41 |
kmlussier |
Dyrcona: Do you think you'll have a chance to look at it today? If not, maybe you can send them my way and I can give it a try. |
10:42 |
Dyrcona |
Yes, I was just about to say that I have all I need and should be able to look at it today. |
14:11 |
JBoyer |
Shame XSLT 2.0 isn't more usable; some of the built-in funcs seem very handy. :/
14:11 |
JBoyer |
kmlussier, a newer MODS transform means less custom edits on my end! :D |
14:11 |
Dyrcona |
berick: I'll have to see if I can find the original from 1929 on youtube later. |
14:11 |
Dyrcona |
dbs: The marc looks good and is entityized. |
14:11 |
kmlussier |
JBoyer: Fair enough then. |
14:12 |
JBoyer |
(though there are a couple things that we want to display that I'm pretty sure will always be locally customized. :/ ) |
14:12 |
Dyrcona |
Bmagic: You're on pg 9.5. Have you had any issue with targeted uris? |
14:33 |
Dyrcona |
So, I'm ready to sign off on it. kmlussier should I just push it? |
14:33 |
kmlussier |
Dyrcona: Yes, please! |
14:33 |
kmlussier |
Dyrcona++ |
14:34 |
Dyrcona |
My lack of skill with the MARC editor notwithstanding. :) |
14:34 |
kmlussier |
And I get to add another new code contributor to the Evergreen wiki today. :) |
14:43 |
|
mmorgan1 joined #evergreen |
14:50 |
Dyrcona |
Y'know what. I think someone else should look at the code changes. |
17:58 |
Dyrcona |
but yeah, berick, that is cool. |
17:58 |
Dyrcona |
After I said I wished it had that feature, I thought I'd just give it a try and it worked. |
17:59 |
|
_adb left #evergreen |
17:59 |
Dyrcona |
And rather than run make ilscore-install in Open-ILS/src, I think I'll copy the marc templates with cp.... |
18:01 |
Dyrcona |
My day has gone on too long already. I should quit. |
18:11 |
Dyrcona |
Ok. I'm out for now. May be back over the weekend. Peace, everyone! |
18:59 |
|
sandbergja joined #evergreen |
15:57 |
|
kenstir joined #evergreen |
16:06 |
kenstir |
How do I query the 856 field via OSRF? In the Android app, I provide an "online access" button for electronic resources, and "place hold" for others. I want to reliably determine whether a record is an online resource and if so the URL. I have been using "open-ils.search.biblio.record.mods_slim.retrieve", but it is not working in some cases. |
16:15 |
Dyrcona |
mods won't include the 856, I don't think. |
16:16 |
Dyrcona |
You could retrieve the MARC of the record and parse that. |
16:20 |
kenstir |
and can you lend me a clue as to how to retrieve the MARC of the record given a record ID? |
16:23 |
kenstir |
I am grepping through Open-ILS/src/perlmods for all methods with "marc" in the name but so far this is not fruitful. |
16:29 |
Dyrcona |
You can retrieve the bre object via pcrud. The marc field will have the marcxml representation of the record. |
16:32 |
kenstir |
Thanks, that's a great lead! |
16:33 |
Dyrcona |
You can also try open-ils.supercat.record.marcxml.retrieve |
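From srfsh, the two approaches sketched above would look something like the following; record ID 123 and the auth token are placeholders, and the exact pcrud call shape may vary:

```
srfsh# request open-ils.supercat open-ils.supercat.record.marcxml.retrieve 123
srfsh# request open-ils.pcrud open-ils.pcrud.retrieve.bre "AUTHTOKEN" 123
```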
16:34 |
Dyrcona |
I'm having fun trying to get FreeBSD syslog to route messages through a script... |
15:46 |
phasefx |
yeah, marc_cleanup throws out empty tags |
15:46 |
phasefx |
brb |
15:48 |
jeff |
phasefx: yaz-marcdump converting to utf8 was how i ended up with an empty 020, yeah. |
15:48 |
Dyrcona |
I usually write my own stuff in Perl and discard any records that MARC::File::XML doesn't like. |
15:48 |
jeff |
yaz-marcdump -v -f MARC-8 -t UTF-8 -o marc -l 9=97 input.mrc > output.utf8.mrc |
15:49 |
Dyrcona |
Oddly enough, it looks like I'll get through this again for real this summer. |
15:49 |
Dyrcona |
We'll have a new branch added with records from Voyager or something like that. |
15:50 |
Dyrcona |
I say oddly 'cause I only just found out about the time this conversation started. |
15:51 |
Dyrcona |
I notice that MARCEdit (is that the program?) can also tolerate some junk in the MARC that MARC::Record and friends don't like. |
15:52 |
jeff |
i am shocked -- SHOCKED -- to find that there is variance in what different MARC-using software tolerates and emits. |
15:52 |
phasefx |
jeff: fun. My chain looks like this: chardet, yaz-marcdump, marc_cleanup, marc2bre, parallel_pg_loader, split, wrap chunks in BEGIN; -- flag munging COMMIT; sql, parallel, quick_metarecord_map |
15:52 |
Dyrcona |
hah! |
15:56 |
phasefx |
either marc_cleanup is saving me, or I've lucked up on catastrophic-failure-causing records |
15:57 |
Dyrcona |
I deal with catastrophic records by using my own parser for binary records. |
15:57 |
Dyrcona |
Just IO::File with $/ = "\x1d";
15:58 |
Dyrcona |
Put an eval around the MARC::Record->new_from_usmarc() and put any that blow up into a reject file. |
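The delimiter being reached for is the MARC21 record terminator, "\x1d". Putting the pieces together, the splitter-with-reject-file idea is roughly the untested sketch below (filenames are placeholders; MARC::Record is from CPAN):

```perl
#!/usr/bin/perl
# Untested sketch: split binary MARC on the record terminator and
# shunt records that MARC::Record rejects into a separate file.
use strict;
use warnings;
use IO::File;
use MARC::Record;

local $/ = "\x1d";    # MARC21 record terminator

my $in     = IO::File->new('input.mrc',  '<:raw') or die $!;
my $good   = IO::File->new('good.mrc',   '>:raw') or die $!;
my $reject = IO::File->new('reject.mrc', '>:raw') or die $!;

while (my $raw = <$in>) {
    my $rec = eval { MARC::Record->new_from_usmarc($raw) };
    if ($@ || !defined $rec) {
        print $reject $raw;          # keep the original bytes for triage
    } else {
        print $good $rec->as_usmarc();
    }
}
```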
15:58 |
* jeff |
nods |
15:58 |
Dyrcona |
I think I got that delimiter correct. That was from memory. :) |
15:59 |
phasefx |
I think that's what marc_cleanup does |
09:55 |
csharp |
our cataloger has wanted something similar before to identify "incomplete" bibs |
09:56 |
* Dyrcona |
has recently found that left joins to mrfr can be slow, but that's basically how I'd do it. |
09:57 |
Dyrcona |
Oh, and the upshot of my Perl threads experiment yesterday is: Don't do it, kids.
09:58 |
Dyrcona |
MARC::File::XML is not thread safe because Encode.pm is not thread safe. |
09:58 |
Dyrcona |
@blame 9 Encode.pm |
09:58 |
pinesol_green |
Dyrcona: Encode.pm is why we can never have nice things! |
10:20 |
berick |
can anyone point me at the docs that form the basis for the authority control set mappings, specifically LoC? |
10:24 |
jeff |
"You map that way so often. I wonder what your basis for comparison is." |
10:30 |
|
jvwoolf joined #evergreen |
10:42 |
Dyrcona |
berick: You could start here: http://www.loc.gov/marc/authority/ |
10:48 |
berick |
Dyrcona: yeah I started there... did not find what I was looking for. |
10:48 |
|
Christineb joined #evergreen |
10:48 |
berick |
i'm sure it's in there somewhere |
10:55 |
Dyrcona |
berick: I don't think there is a simple document for those. I think it comes right out of the authority field specifications. |
10:57 |
berick |
Dyrcona: i'm happy to dig if I'm digging in the right place. and i'm probably looking right past it, but I can't seem to find an example of bib field X is controlled by authority field Y (for LoC thesaurus) in the LC docs. |
10:58 |
Dyrcona |
berick: Yeah. I thought that is what you're looking for and I don't see it, either. |
10:59 |
Dyrcona |
berick: Maybe this: http://www.loc.gov/marc/authority/ecadcntf.html
11:00 |
Dyrcona |
Eh, maybe not. That appears to just explain what to do when you're making a link in a controlled subfield. |
11:03 |
Bmagic |
Dyrcona: years ago, I programmed a "multi"-thread Perl program that used Encode.pm; however, to get around the multi-thread issue, I spawned separate instances that all wrote back to their own pid file
11:04 |
Bmagic |
then the master thread periodically checked up on the "threads" by reading that file |
12:52 |
Dyrcona |
@dessert 46 bshum |
12:52 |
* pinesol_green |
grabs some a thick slice of Jubilee Roll for bshum |
12:52 |
Dyrcona |
Meh. The plugin is "too smart." |
12:57 |
Dyrcona |
I was looking at the deprecated function because I was looking at alternatives to trying to look up the item form in incoming MARC records.
12:57 |
Dyrcona |
Looks like I'll make an object module to read some data from the database and create lookup tables. |
12:58 |
Dyrcona |
It'll have methods to look up various attributes for a given MARC::Record. |
13:12 |
* kmlussier |
rediscovers bug 1522644 and contemplates raising the discussion on the mailing list again to remove that "Transfer All Title" holds button from the web client. |
13:12 |
pinesol_green |
Launchpad bug 1522644 in Evergreen "webclient: Transfer title holds issues" [Medium,New] https://launchpad.net/bugs/1522644 |
13:14 |
mmorgan |
kmlussier++ |
15:55 |
agoben |
lol, yep! Sorry about that! |
15:58 |
|
mmorgan joined #evergreen |
16:33 |
Dyrcona |
Bit late, I know, but.... |
16:34 |
Dyrcona |
I have a record that blows up in MARC::Batch with this message: utf8 "\xE8" does not map to Unicode at /usr/lib/perl/5.14/Encode.pm line 176. |
16:34 |
Dyrcona |
I already have strict_off(), so I wonder how to trap that error and keep on going. |
16:34 |
Dyrcona |
Maybe I should have a look at Encode.pm. |
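One untested way to trap that and keep going is to wrap the batch's next() in an eval; note that after a hard Encode failure MARC::Batch may or may not resynchronize cleanly, so treat this as a sketch (the input filename is a placeholder):

```perl
#!/usr/bin/perl
# Untested sketch: survive records whose bytes make Encode.pm die.
use strict;
use warnings;
use MARC::Batch;

my $batch = MARC::Batch->new('USMARC', 'input.mrc');
$batch->strict_off();
$batch->warnings_off();

my $count = 0;
while (1) {
    my $rec = eval { $batch->next() };
    if ($@) {
        warn "skipping unparseable record after #$count: $@";
        next;
    }
    last unless defined $rec;   # undef means end of input
    $count++;
    # ... process $rec here ...
}
```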
16:45 |
|
bmills joined #evergreen |
16:03 |
kmlussier |
JBoyer: I don't know. She didn't say. |
16:03 |
Dyrcona |
I might have. I don't remember what Groton was using before they joined MVLC. |
16:04 |
Dyrcona |
What do ya know... They were on Library.Solution from TLC. |
16:05 |
Dyrcona |
If she can get the records into either binary MARC or MARCXML, they should load. |
16:05 |
kmlussier |
OK, that gives me enough to write a somewhat helpful response. Thanks! |
16:05 |
kmlussier |
dbs++ JBoyer++ Dyrcona++ |
16:06 |
dbs |
kmlussier++ # actually responding |
16:06 |
Dyrcona |
And, here's a quote from the TLC Extraction Services document: The library can extract their own bibliographic MARC records from Library•Solution® using Cataloging Utilities/Extract Records feature. The extracted file is MARC Communication format. |
16:06 |
kmlussier |
dbs: I'm surprised I caught it. I usually just assume those e-mails are spam. |
16:06 |
rhamby |
I've gotten records from TLC before and they were standard bin marc files |
16:07 |
Dyrcona |
rhamby kmlussier: That's what I suspect the user has. They just don't know it. But I don't know anything about it beyond what I was given and what's in the TLC document that I still have. |
16:35 |
Dyrcona |
Turns out Clark was already running, anyway. |
16:35 |
kmlussier |
ssieb: I've only had knowledge of a few systems, but, in my experience, the holdings information is usually in one field. The field itself may vary from system to system, as may the subfields used for particular pieces of data. But, like I said, my experience is limited.
16:36 |
dbs |
kmlussier: easy for me to come up with ideas, then run away and hope someone else implements them; the hard part is actually making it happen |
16:36 |
Dyrcona |
ssieb: The records are in a format called MARC. It has numeric fields and usually alphabetic subfields. So everything will be in the same field, but different pieces will be in different subfields.
16:37 |
ssieb |
yes, I'm aware of that. I've learned far more about this stuff than I really wanted to. :-) |
16:37 |
Dyrcona |
:) That seems to be what usually happens. |
16:37 |
kmlussier |
ssieb: Welcome to the club! :) |
16:39 |
ssieb |
It's not showing that it finds the barcode, but maybe that's because that would be attached to the items and those are failing on import. |
16:40 |
Dyrcona |
ssieb: That's what I suspect. |
16:40 |
kmlussier |
ssieb: Yes, so if the circ modifier stops the items from importing, then you won't be able to import any of that item information. |
16:40 |
Dyrcona |
kmlussier mentioned a program called MARCEdit earlier. You can use it to make batch edits to a file full of MARC records.
16:40 |
dbwells |
Dyrcona: now action_triggers in dev, there's the real trouble :) |
16:41 |
Dyrcona |
You could use it to add the same circ modifier to all of the records and then reimport them.
16:41 |
Dyrcona |
dbwells: Not if you redirect all of your server's email to /dev/null. ;) |
13:29 |
BookThief |
"Suspense," How would I do so? Any help is appreciated! |
13:30 |
Dyrcona |
BookThief: What version of Evergreen? |
13:31 |
BookThief |
2.8.3, I believe. |
13:31 |
Dyrcona |
Evergreen adds genre indexing from the 655 MARC field. |
13:31 |
Dyrcona |
Oops, I meant to say Evergreen 2.10 adds it. |
13:32 |
BookThief |
Is it not possible on 2.8.3? |
13:32 |
|
mrpeters joined #evergreen |
13:33 |
Dyrcona |
If you want to use the 655 stuff, just run the 0952.data.genre-indexing.sql on your database and then reingest your records. |
13:38 |
Dyrcona |
If you want a custom index, I suggest visiting the dokuwiki http://wiki.evergreen-ils.org/doku.php?id=start and searching for indexing or custom search index in the search box. |
14:00 |
Dyrcona |
@marc 655 |
14:00 |
pinesol_green |
Dyrcona: Terms indicating the genre, form, and/or physical characteristics of the materials being described. A genre term designates the style or technique of the intellectual content of textual materials or, for graphic materials, aspects such as vantage point, intended purpose, or method of representation. A form term designates historically and functionally specific kinds of materials distinguished (1 more message) |
14:00 |
Dyrcona |
@more |
14:00 |
pinesol_green |
Dyrcona: by their physical character, the subject of their intellectual content, or the order of information within them. Physical characteristic terms designate historically and functionally specific kinds of materials as distinguished by an examination of their physical character, subject of their intellectual content, or the order of information with them. (Repeatable) [a,b,c,v,x,y,z,2,3,5,6,8] |
08:46 |
|
ethomsen1 joined #evergreen |
08:49 |
|
Dyrcona joined #evergreen |
08:50 |
|
krvmga joined #evergreen |
09:36 |
Dyrcona |
Are there any good examples of queries using xpath in the database to extract fields from bre.marc? |
09:37 |
jeff |
if you don't care about the fields potentially being normalized, i go with metabib.real_full_rec |
09:37 |
miker |
Dyrcona: what are you looking for? there are plenty in the code... |
09:38 |
jeff |
but since you're asking, you probably care about the values being normalized (i.e., don't want the normalization) |
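Both routes, side by side, in hedged form (record ID 123 is a placeholder; oils_xpath_string is Evergreen's XPath-to-text helper, and the exact signature may differ by version):

```sql
-- Normalized values via the indexed flat table:
SELECT record, value
  FROM metabib.real_full_rec
 WHERE tag = '856' AND subfield = 'u' AND record = 123;

-- Raw, un-normalized values straight out of bre.marc:
SELECT id,
       oils_xpath_string('//*[@tag="856"]/*[@code="u"]', marc) AS url
  FROM biblio.record_entry
 WHERE id = 123;
```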
09:48 |
|
yboston joined #evergreen |
09:48 |
Dyrcona |
jeff: The staff client doesn't handle now holdable precats the way I'd like. |
09:49 |
Dyrcona |
I was doing basically what miker posted, 'cept I had "field" not "datafield." |
09:49 |
Dyrcona |
'Cause y'know, I had just used MARC::Record->field() |
09:51 |
miker |
I wish the perl module had always separated those ... the structure is different, and the "tag < 010" rule is just a USMARC thing |
09:52 |
|
mllewellyn joined #evergreen |
09:54 |
jeff |
Dyrcona: since it's NCIP, mostly i don't care about the staff client, but... in what way would you like them to behave that they do not currently? |
11:19 |
lualaba |
how can there be bad records? Also, I tried to download here: http://wiki.evergreen-ils.org/doku.php?id=evergreen-admin:importing:bibrecords
11:20 |
Dyrcona |
lualaba: You did follow the suggests of the bold text in read at the top of the second URL, right? |
11:20 |
* Dyrcona |
cannot type today. |
11:21 |
Dyrcona |
lualaba: Also, the Gutenberg records are already binary MARC, you don't have to do anything to convert them, except bunzip them. |
11:21 |
lualaba |
Note: is that an older instruction?
11:22 |
lualaba |
And how do I import binary MARC records into the db?
11:23 |
Dyrcona |
lualaba: Try Vandelay. I was able to import the Gutenberg records through Vandelay last time I tried in 2012. |
11:24 |
Dyrcona |
The client timed out, but the import eventually finished. |
11:24 |
lualaba |
i use version 2.8.1 |
11:25 |
Dyrcona |
Vandelay is the "MARC Batch Import/Export" option on the cataloging menu. |
11:25 |
Dyrcona |
The documentation should tell you all you need to know. |
11:25 |
lualaba |
I know, but is there any limit, or does it need time?
11:26 |
lualaba |
after 1000 records I don't see any progress
08:13 |
jboyer-isl |
I'd @quote that if I could. |
08:39 |
|
rlefaive joined #evergreen |
08:47 |
|
Dyrcona joined #evergreen |
08:53 |
Dyrcona |
Well, that's nice: Argument "The" isn't numeric in integer division (/) at /usr/share/perl5/MARC/Record.pm line 407. |
08:54 |
Dyrcona |
That's not from MARC export, so I guess I'll need to trap that and see what record produced it. |
08:54 |
|
maryj joined #evergreen |
08:57 |
Dyrcona |
Hmm. Might be from marc_export after all.... |
09:05 |
Dyrcona |
So, coming from an insert_grouped_field call in marc_export.... |
09:06 |
Dyrcona |
Ah, when adding items on line 473. |
09:06 |
Dyrcona |
The record must have a bad field. |
09:16 |
Dyrcona |
Warning from bibliographic record 1635630: Argument "The" isn't numeric in integer division (/) at /usr/share/perl5/MARC/Record.pm line 407. |
09:16 |
Dyrcona |
is a lot more useful. :) |
09:17 |
jboyer-isl |
What is it doing that there would be any math done at all, never mind math done on fields that haven't been checked for numeric-ness? |
09:18 |
jboyer-isl |
(I suppose I could look that up, what with the line numbers right there.) |
09:18 |
Dyrcona |
Line 407 of MARC::Record is in the insert_grouped_fields method. |
09:18 |
Dyrcona |
It is doing the math to determine where the inserted field(s) belong(s). |
09:19 |
Dyrcona |
That record has a summary field (should probably be a 520?) with a tag of 'The'. |
09:20 |
Dyrcona |
@marc 520 |
09:20 |
pinesol_green |
Dyrcona: Unformatted information that describes the scope and general contents of the materials. This could be a summary, abstract, annotation, review, or only a phrase describing the material. (Repeatable) [a,b,u,3,6,8] |
09:20 |
Dyrcona |
Yep, that looks to me what it ought to be, but I'll let the catalogers determine that. |
09:21 |
jeff |
@marc The |
09:23 |
Dyrcona |
"Wild Snow Sprout," eh.... |
09:23 |
* Dyrcona |
looks at the rain out the window. |
09:25 |
Dyrcona |
And RT ticket 5144 created.... |
09:25 |
Dyrcona |
Hmm. I made a branch to make that change. Maybe I should trap warnings around all calls to MARC::Record in marc_export and then make a LP bug? |
09:29 |
Dyrcona |
Oh, I see what happened.... |
09:29 |
Dyrcona |
The tag is The |
09:29 |
Dyrcona |
ind1 is A and ind2 is d
09:32 |
Dyrcona |
It's what we call a "brief" record. It will get overlaid from OCLC eventually. |
09:32 |
|
mrpeters joined #evergreen |
09:32 |
Dyrcona |
It has the local 590. |
09:33 |
Dyrcona |
@marc 550 |
09:33 |
pinesol_green |
Dyrcona: Information about the current and former issuing bodies of a continuing resource. (Repeatable) [a,6,8] |
09:33 |
Dyrcona |
@marc 650 |
09:33 |
pinesol_green |
Dyrcona: A subject added entry in which the entry element is a topical term. (Repeatable) [a,b,c,d,e,v,x,y,z,2,3,4,6,8] |
09:33 |
Dyrcona |
@marc 500 |
09:33 |
pinesol_green |
Dyrcona: General information for which a specialized 5XX note field has not been defined. (Repeatable) [a,3,5,6,8] |
09:34 |
* Dyrcona |
is trying to remember what field the titles of a compilation go into. |
09:34 |
Dyrcona |
That's the field this should be. |
09:39 |
|
maryj_ joined #evergreen |
09:40 |
Dyrcona |
csharp++ |
09:40 |
Dyrcona |
heh. |
09:40 |
Dyrcona |
@blame [marc The] |
09:40 |
pinesol_green |
Dyrcona: unknown tag The is why we can never have nice things! |
09:40 |
tsbere |
@blame [quote random] |
09:40 |
pinesol_green |
tsbere: Quote #62: "< Dyrcona> À propos a migration from TLC: If you have a column called TOTALINHOUSEUSES you should also have TOTALOUTHOUSEUSES must eat cottage cheese! for symmetry's sake." (added by csharp at 11:49 AM, July 22, 2013) |
12:40 |
Dyrcona |
I imagine the author pwns one of these: https://plus.google.com/+ReverendEricHa/posts/Qn4aTEytdqn?pid=6231152009976367506&oid=103046039519355433778
12:41 |
jeff |
csharp: actually, you'll want to add a criteria to attempt to avoid invalid xml. |
12:44 |
jeff |
SELECT id FROM biblio.record_entry WHERE xml_is_well_formed(marc) AND xpath_exists('//marc:record/marc:datafield[@tag="505"]/marc:subfield[@code="t"]', marc::XML, ARRAY[ARRAY['marc', 'http://www.loc.gov/MARC21/slim']]); |
12:44 |
Dyrcona |
jeff: have you seen much invalid xml in your marc records? |
12:44 |
jeff |
found at least one just now. |
12:46 |
Dyrcona |
I'm running select id from biblio.record_entry where not xml_is_well_formed(marc) on my development database right now to see what I find. |
12:46 |
phasefx |
hrmm, there should be a a_marcxml_is_well_formed trigger on bre |
12:47 |
jeff |
immediate 500 error on supercat marcxml retrieval, mods takes a bit to return an empty collection, standard catalog page returns quickly (but mostly broken), and the MARC Record view in the catalog seems to take a while too. |
12:48 |
jeff |
delays might be unrelated, but i wonder if something gets... stuck. |