Evergreen ILS Website

IRC log for #evergreen, 2014-05-09


All times shown according to the server's local time.

Time Nick Message
00:08 dac joined #evergreen
00:22 zerick joined #evergreen
00:31 jeff joined #evergreen
00:31 jeff joined #evergreen
00:56 remingtron_ joined #evergreen
04:55 pinesol_green Incoming from qatests: Test Success - http://testing.evergreen-ils.org/~live/test.html
07:59 jboyer-laptaupe joined #evergreen
07:59 akilsdonk joined #evergreen
08:14 collum joined #evergreen
08:23 kmlussier joined #evergreen
08:25 tspindler joined #evergreen
08:30 rjackson-isl joined #evergreen
08:33 Shae joined #evergreen
08:39 ericar joined #evergreen
08:42 montgoc1 joined #evergreen
08:45 mmorgan joined #evergreen
09:16 Dyrcona joined #evergreen
09:42 berick dbs: as a thanks for hotkeys, which works great, here's a picture of 22 staff web unit tests completing successfully: http://bill-dev2.esilibrary.com/~berick/karma.png
09:42 berick need to figure out how to make it log the name of each test..
09:43 dbs berick++ # the rest of the stack you're putting together fits with all of the best practices I've seen
09:44 * dbs was just signing in to karma-increment berick
09:44 dbs staff web unit tests are a huge bonus!
09:45 csharp berick++\
09:45 csharp berick++
09:50 berick may be excessive, but kinda cool http://bill-dev2.esilibrary.com/~berick/karma2.png
09:51 dbs The verbose output is a good, reassuring option :)
10:04 pastebot "csharp" at 64.57.241.14 pasted "fatal yaz-marcdump errors for record(s) that is(are) apparently unfindable" (19 lines) at http://paste.evergreen-ils.org/26
10:05 csharp ^^ could someone with some pattern matching-fu, please assist? ;-)
10:05 csharp I don't know how it's possible that what appears to be an ISBN in the yaz-marcdump output would not be in a MARC record somewhere
10:07 jeff 012001548054 is an item barcode there.
10:07 jeff maybe.
10:08 csharp ah - so this could be in the holdings output (852?)
10:09 kmlussier Does anyone else want to add their availability to https://doodle.com/ncdke3xitdzvwfwa before we pick a date for the May dev meeting?
10:09 csharp oh - of course
10:09 csharp so that's why it's not in the MARC record
10:09 csharp jeff++ # being my rubber duck ;-)
10:10 csharp kmlussier++ # wranglin'
10:10 jeff it isn't rubber ducking if i gave you the answer. ;-)
10:11 jeff csharp: is that in the dump from yesterday?
10:11 berick oh no, the rubber duck came to life!
10:11 berick run
10:12 jeff QUACK.
10:13 jeff csharp: but if my answer was incorrect, then it's all on you -- you made that stuff up. :-)
10:13 * berick builds Pacific Rim-esque Go-Bot for battling the duck
10:16 csharp jeff: heh ;-)
10:16 csharp berick++ # EVERGREEN KAIJU
10:17 * csharp recently watched MST3K's "Gamera" episode - very funny
10:18 * berick adds to his netflix list
10:18 jeff mst3k on netflix?
10:19 berick indeed
10:19 jeff there are three hits for the search term mst3k gamera
10:22 mceraso joined #evergreen
10:22 csharp aha! http://gapines.org/eg/opac/record/8968 - 1700+ copies attached
10:22 bshum joined #evergreen
10:22 csharp so I guess the "record exceeds byte length of 99999" warnings don't prevent creation of the invalid MARC record
10:24 dbs csharp: MARCXML doesn't have such a restriction, maybe that's a factor?
10:25 dbs s/MARC must die/MARC binary must die/
10:26 csharp hmm - yeah - I'm creating USMARC files
10:28 Dyrcona csharp: Your guess is correct. How the other end handles those files is another question.
10:28 Dyrcona csharp: Anything using MARC::Record has no problem with them.
10:30 csharp my goal here is to chop my 1.7-million-record file into 90K chunks for OCLC upload
10:30 csharp I knew yaz-marcdump has that nice "split" feature, so I was using that
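[The split feature mentioned above can be invoked roughly as follows; a minimal sketch, assuming a binary MARC input file and an illustrative chunk prefix. `-s` sets the output filename prefix and `-C` the number of records per chunk.]

```shell
# Split a large binary MARC file into 90,000-record chunks.
# Filenames are hypothetical; -i/-o keep input and output in binary MARC,
# -s sets the split-file prefix, -C the records-per-chunk count.
yaz-marcdump -i marc -o marc -s chunk_ -C 90000 all_records.mrc > /dev/null
```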
10:31 csharp I suspect OCLC's whatever-they-use will probably choke too, though
10:31 jeff csharp: Argument "copies" isn't numeric in addition (+)
10:31 jeff :-)
10:31 Dyrcona I don't know. I've never sent files to OCLC.
10:31 bshum joined #evergreen
10:32 Dyrcona I can say that different EBSCO products (divisions?) seem to use different software with different limitations.
10:33 Dyrcona Novelist, for instance, has no problem with the size of our records, but another product did.
10:33 csharp gotcha
10:33 * dbs HATES the OCLC batch upload process. although at least they have a web-based upload option now.
10:34 csharp dbs: web-based as in other than FTP? or do you mean FTP?
10:34 * csharp literally LOLed at the "how to upload files to us" sheet they provided
10:36 Dyrcona PUT /pub/somefile.txt HTTP/1.1
10:36 csharp oh
10:36 Dyrcona Just free forming there. ;)
10:36 csharp gotcha ;-)
10:37 Dyrcona Yeah, I enjoy it when they go into step-by-step instructions on using the Windows command line FTP program.
10:37 csharp most FTP instructions I've seen are 1) create a file 2) FTP it to us at this address with these credentials, so OCLC's multistep process took me aback
10:37 eeevil csharp: one way that's handled is by duplicating the bib over and over with chunks of the holdings that fit within the 100k limit
10:37 Dyrcona "That's nice. I'm using Net::FTP in Perl."
10:38 csharp Dyrcona: yeah, OCLC's instructions assume you're manually doing it in Windows
10:38 eeevil "that" being "too many copies for one record"
10:38 csharp eeevil: that's a thought
10:38 Dyrcona eeevil: I almost said that the marc_export program could be modified to do that, but I bit my fingers, 'cause I don't want to be the one to implement it. :)
10:38 csharp Dyrcona: haha
10:39 Dyrcona So, I didn't say that, you did. :)
10:39 csharp well, what I've got now is a 3GB USMARC binary file that took over 2 days to create (using "old" marc_export)
10:40 dbs csharp: OCLC calls it PSWeb - http://oclc.org/content/dam/support/batchload/documentation/using/PSWebinstructions.pdf
10:40 Dyrcona csharp: Hope they have a 64bit file system, or the 32bit file offset pointer is unsigned. :)
10:40 csharp I'm asking if OCLC will accept MARCXML - otherwise I need to somehow ferret out the too-big records
10:40 berick joined #evergreen
10:40 dbs _of course_ the instructions for using their web-based process are in PDF
10:41 Dyrcona csharp: I send the too-big binary records. I let the vendor deal with it. They demanded all of our copy holdings.
10:41 csharp dbs++
10:41 Dyrcona But, dunno how OCLC will deal with that.
10:42 jeff joined #evergreen
10:42 jeff joined #evergreen
10:42 csharp I guess I can also just do "for file in list_of_90K_bib_ids*.out; do cat $file | marc_export ... ; done;"
10:42 Dyrcona In my experience, most vendors either don't know what MARCXML is or don't want it, preferring binary.
10:42 csharp yeah, same here
10:44 Dyrcona The prevailing attitude seems to be: We "finished" this software in 1995. Why should we have to modify it now?
10:44 Dyrcona j/k
10:44 csharp Dyrcona: yeah - seeing a lot of that (esp. around EDI stuff)
10:44 Dyrcona or "Ha! Ha! Only serious."
10:44 csharp Dyrcona++
10:45 dbs csharp: your bash loop is probably the best way to go in the short term
10:45 dbs getting_it_done++
10:45 * Dyrcona agrees with dbs.
10:45 csharp dbs: yeah, I just settled on that, though I may upgrade to new marc_export for speed benefits ;-)
10:46 Dyrcona For one of our EBSCO exports, they asked me to split our records into two files, so I do something like the above, and dump them in parallel.
10:46 csharp parallelism++
10:47 Dyrcona It was that project that prompted the development of the new marc_export.
10:49 yboston joined #evergreen
10:56 csharp Dyrcona: if I replace the marc_export.in script itself, that's enough, right?  no changes in Perl libraries or anything?
10:57 Dyrcona csharp: I forgot that Fieldmapper.pm also changes to add a from hashref method to make a Fieldmapped object.
10:58 Dyrcona So, you have to get the new Fieldmapper.pm also.
10:58 Dyrcona If you run this on a utility server or set up a "client" machine that doesn't run services, you don't have to upgrade or restart production.
10:59 csharp excellent - thanks
10:59 Dyrcona yw
10:59 dbs Dyrcona++
11:01 * Dyrcona checks his upgrade checklist again.
11:16 vlewis joined #evergreen
11:21 tspindler joined #evergreen
11:22 DPearl joined #evergreen
11:26 csharp ooh - I like the --since parameter - that will come in handy
11:45 csharp Dyrcona: so far I'm seeing "Waiting for Input" and no other screen messages - the process is running with a lot of CPU
11:45 Dyrcona Yep. That's normal.
11:45 csharp is that correct?  I know the progress messages are gone
11:45 csharp ok
11:46 Dyrcona The waiting for input should probably go. I thought it was a good idea if someone didn't give it command line options, but that was in August or September.
11:47 dbs That's a long time to wait for input.
12:14 Dyrcona :)
12:14 Dyrcona dbs++
12:15 Dyrcona Well, I thought it would be a good signal to the user that the program was waiting to receive bib ids.
12:15 Dyrcona I never did figure out how to make the message go away if the command line options didn't require a list of ids, or it already had standard input.
12:16 Dyrcona And then I just kind of forgot about it.
12:52 bmills joined #evergreen
13:18 kmlussier1 joined #evergreen
13:42 * jeff injected logic in our collection hq extracts to create a fake branch made up of items from a particular branch with a particular stat cat
13:42 * jeff wonders if anyone else has had similar needs
13:44 jeff also looking at "exclude items with this stat cat value" logic. i suspect that will be more widely useful and also easier to just incorporate as an optional thing in the collectionhq schema.
13:56 DPearl Question about fm_IDL.xml.  The "main" one is typically in (say) /openils/conf/fm_IDL.xml .   There is another one, often overlooked (mea culpa) in /openils/var/web/reports.   It looks like to make changes to the former, Open-ILS/examples/fm_IDL.xml is the place.  Where are changes made to the latter?
13:57 dbs DPearl: the latter is automatically generated at build time from the former
13:57 DPearl dbs: Thanks. That is convenient.
13:58 dbs If you're not using multiple locales, I think it can just be symlinked to the main one
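[For a single-locale site, that symlink might be created like so; paths follow the discussion above, and the guard keeps this a no-op on machines without an Evergreen install. Verify the paths against your own installation first.]

```shell
# Point the reports copy of the IDL at the main one (single-locale setups only).
# Paths are from the discussion above; the guard makes this safe elsewhere.
if [ -d /openils/var/web/reports ] && [ -f /openils/conf/fm_IDL.xml ]; then
    ln -sf /openils/conf/fm_IDL.xml /openils/var/web/reports/fm_IDL.xml
fi
```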
14:01 jihpringle joined #evergreen
14:13 bmills joined #evergreen
14:19 hbrennan joined #evergreen
14:28 kmlussier jeff: I'm looking at the changes that came with commit 07e6490
14:28 pinesol_green [evergreen|Jeff Godin] Show jacket on record page even when no isbn/upc - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=07e6490>
14:29 kmlussier We have two consortia using Content Cafe on 2.4 that are getting ready to upgrade to 2.5. It looks like that commit took away their ability to retrieve DVD covers by UPC.
14:32 kmlussier But I'm guessing re-adding that code might break something else?
14:32 wjr_ maybe you could put that back in but just remove the if/end bits. that way ident gets set
14:33 * kmlussier can try that.
14:33 wjr_ the reason for that change was we upload covers manually for records that have no isbn/upc, and that block was preventing the html inside it from displaying if the record had no isbn/upc
14:34 wjr_ but ident is possibly used elsewhere (though syndetics still works)
14:34 wjr_ jeff's afk right now, i'm sure he will reply too when he's back ;)
14:37 kmlussier Sure, that makes sense. I noticed when looking at the LP bug yesterday that there was code to help Syndetics with the UPC stuff, but I guess we must be in the minority using Content Cafe.
14:44 jeff content cafe and syndetics jackets are somewhat different beasts, so there wasn't anything required to make Content Cafe able to fetch jackets by UPC -- but it was not my intent to break Content Cafe UPC lookups. Investigating.
14:46 tsbere left #evergreen
14:46 kmlussier jeff: No, I didn't think it was your intent. :)
14:46 jeff heh
14:46 tsbere joined #evergreen
14:49 jeff kmlussier: Are you seeing the issue in search results, record detail pages, or both? Do you have a public catalog URL I can poke at that is configured for content cafe?
14:50 jeff oh, i think I see the issue.
14:50 kmlussier Both. We have a public URL, but I think they have currently customized their summary.tt2 to add back the code that was there.
14:50 kmlussier So it is working at the moment.
14:51 jeff What is the URL? If they've changed it to fix, it would probably confirm my thinking as to what the issue is.
14:52 kmlussier Here is one of the records that was broken with the stock code. https://egtraining.noblenet.org/eg/opac/record/2098711?locg=1
14:52 jeff kmlussier++ thanks!
14:52 kmlussier Wait, that one has an ISBN. I have another I was looking at.
14:53 jeff query kmlussier
14:53 jeff er.
14:53 jeff :P
14:54 kmlussier Here's another example. http://egtraining.noblenet.org/eg/opac/record/3218810
14:58 jeff yeah, it actually looks like all jacket images might be broken with content cafe and ac-by-record-id.
14:58 jeff thanks for raising the issue and bringing it to my attention.
14:59 jeff if you'd like to open a bug, please do. otherwise, I'll try to do it soon. workaround is to force the templates to use the previous URL format for the jacket images.
14:59 kmlussier OK, I'll open one.
15:00 jeff I'll plan to get a reasonable fix in and ready for the next release.
15:00 kmlussier jeff: Thanks for looking into it!
15:00 jeff kmlussier++ thanks again!
15:15 berick joined #evergreen
15:15 jboyer-laptaupe jeff: speaking of CollectionHQ, have you done any looking into the weekly Holds extracts that they're hoping to start doing?
15:16 jboyer-laptaupe (Speaking of in a temporal sense, loosely)
15:16 jeff jboyer-laptaupe: i have not seen anything about that yet, but will look. i am about to work with them to increase the frequency of our extracts.
15:17 jboyer-laptaupe I was just making sure that we weren't working on the same thing. I'm basically finished with it, just waiting on a couple of questions re: their handwave-y vague spec.
15:17 jboyer-laptaupe I'm surprised they haven't mentioned it to you already, they've had both of our members that use their service bring it up here.
15:18 jeff i may have simply avoided reading their latest release notes.
15:21 jeff jboyer-laptaupe: do you have a link regarding that?
15:22 csharp we only have CollectionHQ at a single library right now
15:23 jboyer-laptaupe Nope, I've not seen any mention of it aside from 2 of our members.
15:24 csharp heh
15:33 DPearl Help! I did a push to the working repository and I want to back out the submissions.  Can I?
15:34 csharp DPearl: do you want to blow away the whole branch? or undo a commit?
15:34 kmlussier I removed something from the working repository once, but I don't remember how I did it.
15:34 * kmlussier isn't very helpful.
15:35 csharp DPearl: this may help http://stackoverflow.com/questions/2003505/how-do​-i-delete-a-git-branch-both-locally-and-remotely
15:35 DPearl csharp: Somehow, some actions I did did a submission;  I think I mis-used cherry-pick.  I want to undo that.  And while I'm at it, I want to remove the branch I submitted to make sure it is complete.
15:36 * csharp accidentally pushed a password once and that's what I did
15:36 DPearl csharp: password!! Nice!
15:36 DPearl csharp: Thanks for the pointer.
15:36 csharp heh - it reminded me how many servers we have to change when that kind of thing happens ;-)
15:38 jeff it's worth noting that the steps in that article are not sufficient in the case of sensitive / private data being committed by mistake.
15:42 jeff (which does not appear to be the case here, so carry on)
16:08 tspindler Dyrcona: I saw you were asking yesterday in IRC about taking electronic payments - we have been doing it in Evergreen since we went live, and did it in our previous system
16:09 Dyrcona tspindler: Yep. It was an internal discussion in a central site staff meeting, so I thought I'd ask in IRC what others might be doing.
16:10 Dyrcona a belated jboyer-laptaupe++ for sharing scripts yesterday.
16:11 tspindler FWIW: we use paypal as the processor and the money goes into a deposit account.  Libraries can then use it like a deposit account.  It does create an accounting overhead.
16:12 jboyer-laptaupe Dyrcona: I hope they can help if that's where you're headed. I know documentation is... sparse.
16:25 DPearl csharp++ Your advice was right on the mark!
16:30 tspindler left #evergreen
16:43 csharp DPearl: great! - happy to help
16:47 berick joined #evergreen
16:49 kmlussier @dessert [somebody]
16:49 * pinesol_green grabs some Cookies and Cream Ice Cream for Mr. Spock: Something fascinating just happened.
16:50 mmorgan kmlussier++
16:50 kmlussier hmm...I guess I did something wrong there.
16:51 mmorgan ...but something fascinating happened
16:51 kmlussier @dessert [someone]
16:51 * pinesol_green grabs some Coconut Cream Cake for frank_____
16:52 kmlussier Feliz Viernes!
16:53 frank_____ yeah Feliz viernes!!!
16:58 kmlussier @coin
16:58 pinesol_green kmlussier: heads
17:05 mmorgan @dessert
17:05 * pinesol_green grabs some Krispy Kreme Donuts for mmorgan
17:05 kmlussier mmorgan: A nice way to start your weekend! :)
17:05 * kmlussier has never had a Krispy Kreme.
17:05 mmorgan yeah
17:06 * mmorgan has had one
17:06 mmorgan ... followed by sugar crash:)
17:08 mmorgan @dessert kmlussier
17:08 * pinesol_green grabs some Sweet Potato Pie for kmlussier
17:08 kmlussier mmorgan: Thank you!
17:11 kmlussier Have a nice weekend everyone!
17:12 hbrennan You too, kmlussier!
17:12 hbrennan *sigh*  Only lunch time in Alaska, and SOOOO sunny out.
17:12 mmorgan @dessert hbrennan
17:12 * pinesol_green grabs some Pineapple Upside Down Cake for hbrennan
17:13 mmorgan That should get you through the afternoon
17:13 hbrennan Ah thanks
17:13 kmlussier mmorgan++ :)
17:15 mmorgan left #evergreen
17:28 pinesol_green Incoming from qatests: Test Success - http://testing.evergreen-ils.org/~live/test.html
