Evergreen ILS Website


Results for 2017-11-14

15:51 csharp s/quick and handy/super damn slow in your case/
15:51 Dyrcona rsync with --bandwidth= is my best bet.
15:52 Dyrcona Well, at one location the bandwidth is asymmetrical and that's the one that I'd be sending from, so I don't want to crush what others are doing.
15:52 Dyrcona I've had that happen just sending marc records to the masslnc servers for testing.
15:53 Dyrcona So, yeah, gymnastics are required. :)
15:53 Dyrcona Hm... It happened on this testing server again with -j 4.
15:54 Dyrcona I should see if it happens on the development server.

Results for 2017-09-29

10:43 JBoyer @marc w00t?
10:43 pinesol_green JBoyer: unknown tag w00t?
10:44 JBoyer Database is lacking.
10:44 Dyrcona @marc w00 t
10:44 pinesol_green Dyrcona: unknown field/subfield combination (w00/t)
10:44 Dyrcona :)
10:55 csharp @marc 1337

Results for 2017-09-21

11:57 LSachjen joined #evergreen
12:01 Dyrcona Nope. vmbuilder just don't work no more.
12:03 bwicksall Has 2.12.5-3.0-beta1-upgrade-db.sql blown up for anyone with the following? https://pastebin.com/ysHaaBXb
12:12 Dyrcona No, but it looks like you have bad MARC records.
12:13 Dyrcona I haven't run the upgrade script.
12:13 * Dyrcona thinks it is time to investigate lxd.
12:23 bwicksall I'll run through the update step by step
12:24 khuckins joined #evergreen
12:29 jeffdavis bwicksall: At a glance it doesn't look like a bug in the upgrade script, but an issue with updating some record in your system which has bad character encoding (the "maintain_control_numbers" function usually runs when a biblio.record_entry record is updated).
12:30 jeffdavis I'm trying to think of a good way to find the record(s).
12:32 Dyrcona write something to iterate through all of the biblio.record_entry table entries, spit out the id, then call maintain_control_numbers or just update the record with update on same marc set to true.
12:32 Dyrcona The last record id you see will be the first one that causes a problem.
12:33 Dyrcona It would help to do them in id order with a way to specify a starting record id.
12:33 Dyrcona repeat that until none blow up.
12:34 jeffdavis That's basically a full reingest though right?
12:34 Dyrcona Yeah, pretty much.
12:34 Dyrcona But, that's how you'll find all the bad records.
12:35 Dyrcona Or, I guess you could pull the marc out and try to make MARC::Record(s) out of 'em, but I've seen that succeed even on some bad records.
12:36 jeffdavis I wonder if xml_is_well_formed(marc) would catch encoding issues?
12:37 Dyrcona jeffdavis: Don't think so, since it runs every time a record is inserted or updated.
12:38 Dyrcona Of course, if triggers were disabled during a batch load, then..... ;)
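Dyrcona's id-ordered probe could be sketched in SQL roughly like this (a sketch only, assuming an Evergreen schema; the starting id is a placeholder to raise on re-runs, and because the DO block aborts on the first failure, the last NOTICE printed names the first bad record):

```sql
-- Sketch of the probe Dyrcona describes, assuming an Evergreen schema.
-- Updating each record fires maintain_control_numbers et al.; the block
-- aborts at the first bad record, so the last id printed is the culprit.
DO $$
DECLARE
    rid BIGINT;
BEGIN
    FOR rid IN SELECT id FROM biblio.record_entry
                WHERE NOT deleted AND id >= 1   -- raise the floor on re-runs
                ORDER BY id
    LOOP
        RAISE NOTICE 'updating record %', rid;
        UPDATE biblio.record_entry SET marc = marc WHERE id = rid;
    END LOOP;
END $$;
```

NOTICE output is sent to the client as it runs, so it survives the rollback that follows the error.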

Results for 2017-09-14

10:26 berick mdriscoll++
10:26 mdriscoll Could the update in 1075 be run in parallel? I have 16 cores and 15 have nothing to do.
10:28 mdriscoll s/1075/1057/g
10:34 Dyrcona Seems to me it would make sense to modify the triggers so that maintain_901 and maintain_control_numbers are only called if the MARC is changed.
10:39 Dyrcona mdriscoll: You could try doing it in parallel to see. I think it could be.
10:52 * kmlussier agrees with berick, et al. on disabling the trigger for the upgrade.
10:56 miker there are no triggers that need to fire during that particular script.  I'm still in favor of wrapping the relevant guts of 1057 in set session_replication_role replica/origin ... pg naming decisions aside, all that does is say "don't fire triggers / do fire triggers"
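In SQL, miker's wrapper amounts to the following (a sketch; the comment stands in for the actual statements of upgrade script 1057):

```sql
SET session_replication_role = replica;  -- don't fire triggers
-- ... the relevant guts of upgrade script 1057 ...
SET session_replication_role = origin;   -- do fire triggers again
```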

Results for 2017-09-12

16:05 roycroft does evergreen have a way of doing those imports?
16:05 gmcharlt yeah, that's a reasonable plan
16:05 roycroft i.e. can i upload the csv to evergreen and it will go suck down the details?
16:06 Dyrcona roycroft: You'll need to convert the CSV to MARC.
16:06 roycroft there's a library system that covers most of eastern oregon (i'm in western oregon) who use evergreen, and folks seem to think it's a really nice package, which is what got me looking at it in the first place
16:06 roycroft so csv to incomplete marc, and then evergreen can get the full marc records?
16:06 Dyrcona Also, if you want to test on stretch use the branches on the bug I referenced earlier.

Results for 2017-08-30

13:20 Dyrcona Yeah. We could be approaching an event horizon.
13:21 * Dyrcona greps the code for examples of xslt_process used in the database, specifically for mods transformation.
13:24 Dyrcona Easy enough... :)
14:00 Dyrcona @marc 035
14:00 pinesol_green Dyrcona: A control number of a system other than the one whose control number is contained in field 001 (Control Number), field 010 (Library of Congress Control Number) or field 016 (National Bibliographic Agency Control Number). (Repeatable) [a,z,6,8]
14:00 * Dyrcona ponders what to call it in the spreadsheet.
14:03 dbs That needs a 'q' as well. Dang old data.

Results for 2017-08-16

14:55 Dyrcona lasse_: Evergreen uses MARC21 with a heavy emphasis on Library of Congress standards, but it is used successfully in other nations, such as the Czech Republic.
14:55 csharp berick: works great!  I'll create a signoff branch so it can get into the 3.0 mix :-)
14:55 berick csharp: cool
14:56 Dyrcona lasse_: You might want to look at Koha, too. IIRC, they have support for different MARC formats.
14:56 lasse_ Dyrcona: thanks - I'm already looking :)
14:57 rhamby I once saw a presentation about a map library in Denmark using a MARC variant called danMARK but I don't know how widely it's used there
14:57 lasse_ rhamby: that would be a little on the nose methinks :)

Results for 2017-07-27

11:14 Dyrcona Bmagic: If you want one, it's probably the other. ;)
11:14 Bmagic I tried both
11:15 Dyrcona Is there a date in the 240?
11:15 Dyrcona @marc 240
11:15 pinesol_green Dyrcona: The uniform title for an item when the bibliographic description is entered under a main entry field that contains a personal (field 100), corporate (110), or meeting (111) name. [a,d,f,g,h,k,l,m,n,o,p,r,s,6,8]
11:15 Dyrcona No.... I'm thinking of the 260....
11:15 Dyrcona :)

Results for 2017-07-26

12:09 berick miker: thanks.  may have lucked out in this case, found an index roughly matching the marc query I wanted to make
12:10 berick i guess the broader question, short of an index or QP change, it's not possible (in the UI) to easily go from marc tag query to bucket
12:38 jihpringle joined #evergreen
12:46 Dyrcona Freddy_Enrique: MARC tags > 900 are not defined in the standard. They are left alone for local use.
12:48 Freddy_Enrique Dyrcona: thanks, I wasnt aware of that
12:49 Freddy_Enrique btw, are you the same on the other forum?
12:49 Dyrcona Freddy_Enrique: That's true for the USA, it may be different in other countries. :)

Results for 2017-07-05

12:57 Freddy_Enrique Dyrcona, in order to make my record appear on the OPAC, I must create an item first right?
12:58 Dyrcona Freddy_Enrique: Usually, yes.
12:58 Freddy_Enrique I made a clean installation, I created the org units, created some users (with staff permissions) and then went directly to the records
12:58 Dyrcona Or a located URI, i.e. a URL in an 856 MARC tag with a subfield $9 and a couple of other conditions.
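The located-URI shape Dyrcona mentions looks roughly like this in MARCXML (a sketch: the URL and the owning-library shortname in $9 are made-up values, and the "couple of other conditions" he alludes to are not shown):

```xml
<datafield tag="856" ind1="4" ind2="0">
  <subfield code="u">http://example.org/ebook/123</subfield>
  <subfield code="y">Online access</subfield>
  <subfield code="9">EXAMPLE-LIB</subfield>
</datafield>
```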
12:58 Freddy_Enrique i could see them in the xul
12:59 Freddy_Enrique but I could not visualize it on the opac
12:59 Freddy_Enrique !
13:09 Freddy_Enrique Marcedit software can help me with that
13:09 Dyrcona How are you importing your records?
13:10 Freddy_Enrique Well,  I had some experience with migration. But, the final format of my excel records ended with *mrc.
13:10 Dyrcona Assuming that you're coming from an old system, it would be good if you can get it to put the information you need into some non-standard MARC tag during the export.
13:11 Freddy_Enrique here if I'm not mistaken, is marcxml
13:12 Freddy_Enrique uhm.... what if the records are contained in csv? would be much easier?
13:12 Dyrcona Well, I don't know. That depends on what you're starting with.
13:13 Dyrcona Here's one way to do it: http://docs.evergreen-ils.org/2.12/_migrating_your_bibliographic_records.html
13:13 Freddy_Enrique I have my records in many formats; just to clarify, those don't go beyond 1000
13:14 Dyrcona When I said "records" I meant the MARC records, and not other records.
13:14 Dyrcona I've done a migration or 4, and it depends a lot on what you're starting with and what access you have to the original system.
13:16 Freddy_Enrique Here there are maaany libraries that... have been working with just excel
13:17 Dyrcona So, they've been using Excel to track their patrons, items, and transactions?
13:19 Freddy_Enrique marc records then... I have also exported marc record with  *mrk  extension
13:20 Freddy_Enrique mrc and csv
13:20 Freddy_Enrique Thanks Dyrcona, I'll really need it
13:20 Dyrcona mrk is usually a special format called "MARC Breaker" format.
13:21 Dyrcona It is used by MARCEdit, MARC Breaker, and some other programs.
13:21 Freddy_Enrique With this format I can edit the marc records. The mrc is just for importing
13:21 Dyrcona Evergreen wants either standard MARC 21 (usually .mrc) or MARCXML (usually .xml).
13:22 Dyrcona Yes, that's right.
13:22 Freddy_Enrique Then...If I have the mrc file, I have most of the work done?
13:23 Dyrcona Well, if you can get the item information in the mrc file, it would be easier to import item and bibliographic information at once.
13:23 Dyrcona You can do that in the staff client import feature, called Vandelay.
13:24 Dyrcona How are you getting the MARC from the CSV? Are you looking the records up somewhere via ISBN?
13:25 Freddy_Enrique Nop, I use MarkEdit to convert my csv files to mrk files
13:26 Freddy_Enrique when I'm done polishing the marc records, I finally convert it to mrc
13:27 Dyrcona I don't use MARCEdit, so I didn't know it could convert csv to MARC, but I guess it wouldn't be too hard to make a rudimentary record.
13:29 kmlussier joined #evergreen
13:29 Freddy_Enrique It is possible, there I can also add a field for the copies/items. In another system, the field is 952
13:29 Freddy_Enrique Is it the same for Evergreen?
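As a rough illustration of the CSV-to-MARC step under discussion, in Python rather than the MARCEdit workflow Freddy_Enrique uses. The 949 tag, its subfield layout, and the CSV column names are all hypothetical; in Evergreen the holdings tag Vandelay reads is configured in the import item attribute definitions, not fixed at 952.

```python
import csv
import io
import xml.etree.ElementTree as ET

def row_to_marcxml(row):
    """Build a minimal, rudimentary MARCXML record from one CSV row."""
    rec = ET.Element("record")
    leader = ET.SubElement(rec, "leader")
    leader.text = "00000nam a2200000 a 4500"
    for tag, code, value in (
        ("100", "a", row["author"]),
        ("245", "a", row["title"]),
        ("949", "b", row["barcode"]),  # hypothetical local holdings tag
    ):
        df = ET.SubElement(rec, "datafield", tag=tag, ind1=" ", ind2=" ")
        sf = ET.SubElement(df, "subfield", code=code)
        sf.text = value
    return ET.tostring(rec, encoding="unicode")

csv_data = "title,author,barcode\nDon Quijote,Cervantes,30001000123\n"
row = next(csv.DictReader(io.StringIO(csv_data)))
print(row_to_marcxml(row))
```

A real migration would also need fixed fields, 008 data, and per-copy subfields (call number, circ modifier, owning library), which is why Dyrcona stresses that it depends on what you are starting with.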

Results for 2017-06-27

10:23 pinesol_green [evergreen|Jason Etheridge] lp1208875 make _get_circ_history_csv work with fetch_user_circ_history - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=abf4fc5>
10:23 pinesol_green [evergreen|Galen Charlton] LP#1208875: follow-up to standardize extract fields - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=292d0fd>
10:23 pinesol_green [evergreen|Jason Stephenson] LP#1208875: Add Release Note - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=1dea2ca>
10:23 Dyrcona frank___: You are creating the marc record using the client or web staff client? How are you entering the copyright symbol?
10:24 Dyrcona collaboration++ :)
10:24 frank___ Dyrcona: I am creating it in staff client (Cataloging>>Create New marc record option)
10:25 Dyrcona Are you adding the copyright symbol using the symbol menu or are you typing it in?

Results for 2017-06-16

16:06 jeffdavis but both would be good :)
16:08 Dyrcona I don't think either has such a trigger.
16:08 Dyrcona I could see reasons to do a database update without changing the edit date.
16:08 Dyrcona I often include the edit date when updating the marc in bre, and the editor for that matter.
16:09 mmorgan When we do database updates, we generally set the editor and edit date for clarity.
16:10 Dyrcona I sometimes forget.
16:10 Dyrcona Well, I often forget, let's put it that way. :)

Results for 2017-05-30

10:25 Dyrcona Have to delete it in a transaction.
10:33 Dyrcona Dunno why the asset.uri entry wasn't removed. One of the 856s has two $9, but not for the library that shows up.
10:34 jvwoolf1 joined #evergreen
10:35 Dyrcona If I was any good with the MARC Editor, I'd remove that extra $9.
10:35 Dyrcona It's also not in my job description, so I don't want to get in any trouble. :)
10:39 csharp Dyrcona: my advice: back away slowly!
10:39 Dyrcona I'll just delete the asset.uri entry from the database tonight.

Results for 2017-05-24

09:40 miker bonus points for cut+parallel
09:40 miker er, split, not cut
09:40 bos20k miker: I think that code is in 0964.data.electronic-resources.sql
09:41 Dyrcona bos20k: Going to 2.12, you'll probably want to skip that though and just update the records in a transaction with reingest on same marc set to true.
09:41 miker bos20k: so it is!
09:42 Dyrcona There's more than just a metabib reingest required, if you want the new goodies, like 901$s.
09:42 bos20k miker: is that faster than the SELECT in 0967.data.more_fixed_fields.sql?
13:11 Dyrcona Nope.
13:12 bos20k er, metabib.reingest_metabib_full_rec() that is
13:12 jeff in the above, my "full" was referencing the style of ingest where you set appropriate flags and execute a query or queries that cause the triggers on biblio.record_entry to fire for all records.
13:12 Dyrcona update biblio.record_entry set marc = marc where....
13:12 Dyrcona Or similar.
13:12 bos20k Oh, so activating all the triggers there...
13:13 bos20k Hmmm, so like the 901$s update but not for only those with a source set.
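The "update in a transaction with reingest on same marc" pattern from earlier in this conversation, fleshed out as a sketch (the internal flag name is from memory and worth verifying against config.internal_flag on your version):

```sql
BEGIN;
-- Force a full reingest even though the MARC itself is unchanged
UPDATE config.internal_flag SET enabled = TRUE
 WHERE name = 'ingest.reingest.force_on_same_marc';
UPDATE biblio.record_entry SET marc = marc WHERE NOT deleted;
UPDATE config.internal_flag SET enabled = FALSE
 WHERE name = 'ingest.reingest.force_on_same_marc';
COMMIT;
```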

Results for 2017-05-23

14:51 Dyrcona Removing 856s from bib records.
14:52 Dyrcona The table is for vendors and holds some data about them, including two regexes: one to identify their URLs, and the second to get the vendor's record id from the URL.
14:52 Dyrcona The view just combines that table with asset.uri, uri_call_number_map, and call_number.
14:55 Dyrcona There has been a custom here of combining URLs from different vendors in the same MARC record or "regular" MARC records.
14:56 mmorgan Interesting, so when a library drops a resource from vendor x, it helps you remove the associated 856 fields by vendor.
15:04 mmorgan1 joined #evergreen
15:11 Dyrcona mmorgan1: Yes.

Results for 2017-05-11

09:44 Dyrcona sonOfRa: Evergreen can interface with LDAP, but does not have BibTeX, and patrons cannot see what other patrons have out, unless you make everyone a staff user, which it sounds like you would want to do.
09:44 collum joined #evergreen
09:44 kmlussier sonOfRa: We do have support for LDAP. We don't export citations yet, but I believe there are some people who use Zotero for their citations.
09:45 Dyrcona sonOfRa: Evergreen uses MARC cataloging. You would need to add a simple dialog.
09:45 sonOfRa Yeah, the MARC cataloging is what kind of turned me off of koha :(
09:46 sonOfRa Hoping to find something that is a bit more agreeable to the poor sod who'll be cataloging the 4000 books that we currently (probably) have.
09:46 kmlussier sonOfRa: I know of some small libraries that use LibraryThing for their collections.

Results for 2017-04-24

08:52 jeff Guest995: how do you generally deliver MARC records to customer libraries? does the library fetch them from you via FTP? SFTP? logging into a web interface? email?
08:52 graced Guest995: well that smells like something we could probably point you in the right direction for... but if you need more help I'd come back into channel after say 10:30am when the bulk of the developers are around
08:53 Dyrcona jeff | Guest995: There is the new e-book API that jeffdavis has been working on. That sounds more like what you're after.
08:53 Dyrcona It requires the ingest of MARC records, though.
08:54 Guest995 Oh, good to know. We have a small interface where customers can come to download their records for their specific content sets. We provide them mrc files and xml files
08:54 Dyrcona It's designed to reach out to the vendors' APIs and get the circulation status of electronic items, and eventually allow the patron to circulate, etc. from the PAC.
08:54 jeff Dyrcona: Since the biblioboard content doesn't involve circs, and the ebook APIs for OverDrive / OneclickDigital both depend on MARC records being loaded via normal means, I'm not sure the ebook APIs help here.
11:35 Dyrcona pinesol_green: Damned, Yankee!
11:35 pinesol_green Factoid 'Damned, Yankee!' not found
11:35 pinesol_green Dyrcona: Mr. Spock: Something fascinating just happened.
11:37 Dyrcona marc?
11:37 Dyrcona No. I guess that's not how it works.
11:38 Dyrcona Already, someone is reporting cataloging bugs to me in the webstaff client.
11:39 Dyrcona I only finished the installation about 45 minutes ago.

Results for 2017-04-19

09:48 mmorgan joined #evergreen
09:48 mmorgan kmlussier: Good Morning!
09:49 Dyrcona Good Morning! Good Morning! Guten Morgen!
09:50 Dyrcona So, on that memory issue with exporting bibs, I see a simple script that dumps all marc for non-deleted records is using 11GB of RAM on a debian 7 machine.
09:50 * Dyrcona should write a revised email.
09:50 kmlussier I'm planning to merge a couple of bug fixes this morning since it is a maintenance release day. But, since I left the hackfest early, I just want to verify who is doing what.
09:51 kmlussier berick and Bmagic are in charge of building maintenance releases? Is that right?

Results for 2017-04-10

17:32 Dyrcona Bmagic++
17:32 jamesrf joined #evergreen
17:32 Dyrcona We use 901$c with records sent back from vendors like Backstage.
17:37 Dyrcona jeff: You can, as of 2.9, use the marc stream importer on file batches. Maybe that feature was added earlier.
17:40 jeff berick++ Bmagic++ Dyrcona++ thanks! helpful info.
17:40 jeff Dyrcona: i'd had that in the back of my mind -- i've made use of that before
17:41 jamesrf joined #evergreen

Results for 2017-03-27

15:18 Dyrcona I've been given a screen shot with FILE_UPLOAD_ERROR.
15:19 Dyrcona That occurs one time in the code, in a function called upload_files, which actually appears to run after the file is uploaded (by Vandelay?).
15:21 kmlussier Yeah. I've been able to find a more useful error in the logs when I've seen that.
15:22 Dyrcona Well, it is supposed to log "unable to read MARC file $filename", but I can't find that anywhere.
15:23 * Dyrcona thinks his logging is still broken, despite JBoyer's help.
15:23 Dyrcona I was looking for other messages last week and couldn't find them, either.
15:24 Dyrcona So, I'm asking for hints of what else to look for.
15:46 Dyrcona What I don't know is the whole key to look up in memcached.
15:47 Dyrcona And the servers both have about 2GB cached with over 200,000 items each.
15:47 JBoyer Hmm. Things have to be done on both the sending and receiving machines, if that helps. But yeah, if your primary issue is related to Acq or EDI I'm not much help there.
15:48 Dyrcona It's acq creating a PO/picklist from a MARC file upload. I'm told that a couple of libraries report often having to do it more than once lately to get it to work.
15:48 wsmoak joined #evergreen
15:50 JBoyer I could see how that could potentially point to memcache. Do you have all of the same servers listed in the same order everywhere you specify memcache connections? Depending on how and when things load them and then process them, this could happen:
15:51 Dyrcona They should be. I use the same config files everywhere, AFAIK. Unless someone has been messing with them behind my back or some change didn't make it everywhere in the recent server migration.

Results for 2017-03-20

09:20 maryj joined #evergreen
09:38 kmlussier joined #evergreen
09:51 * Dyrcona thinks his initial assessment of a "bug" might be wrong.
10:00 Dyrcona Just doing this: "select distinct * from biblio.record_entry where not deleted" and printing the marc field from the result leads to memory starvation.
10:01 Dyrcona That is more or less the query if you do marc_export -a with no other selection options.
10:02 Dyrcona I don't believe the starvation happened on Wheezy or Trusty. I can build a vm on one of those later to test it.
10:02 mmorgan1 joined #evergreen

Results for 2017-03-13

11:28 berick not running the targeter during the normal pull list time seems like a really good idea.
11:29 miker it makes pulling holds a little more predictable
11:32 Bmagic dbs Dyrcona jeff - this is specifically the bash command? Or exporting via vandelay?
11:33 Dyrcona Bmagic: I have no idea about Vandelay, but the problem seems to involve MARC::Record somehow.
11:33 Dyrcona Well, both problems.
11:34 Bmagic alright
11:36 Bmagic I have a handful of custom perl scripts that extract the marc on a regular basis on 16.04. Let me take a look at the one from March
12:47 dbs http://search.cpan.org/~gmcharlt/MARC-XML-1.0.3/lib/MARC/File/XML.pm#new_from_xml([$encoding,_$format]) suggests it's okay, I think
12:51 jeff dbs: from my read of the MARC::Record::XML function's documentation, it's purely a "give me a record with this encoding" thing. While I have encountered "MARC-8 encoded MARCXML" at least once in the wild, it isn't something that I think anyone wants to encourage as being a Thing. :-)
13:14 Dyrcona I'd like to point out that this started being a problem at Perl 5.20 or so.
13:14 Dyrcona I think MARC::Record and/or MARC::Charset need fixes, not marc_export.
13:15 Dyrcona Encode.pm has likely changed on us, again.
13:17 dbs Dyrcona: I'm on ubuntu 14.04 with perl 5.18 fwiw
13:20 Dyrcona I never noticed it on 14.04, but doesn't mean it didn't happen and I was unaware.
14:10 jeff i haven't tested to see how yaz tools handle it
14:11 jeff oh, nevermind -- outstanding pull request from tsbere, actually: https://github.com/perl4lib/marc-perl/pull/4
14:12 jeff though there's something else similar that i saw elsewhere... hrm.
14:14 Dyrcona Writing your MARC record splitter in Perl is remarkably simple.
14:14 Dyrcona I keep words... :)
14:14 jeff and this: https://rt.cpan.org/Public/Bug/Display.html?id=70169
14:14 Dyrcona Anyway, since I'm messing with marc_export stuff lately, I'd like to make some improvements.

Results for 2017-03-10

08:55 * jeff files a bug with his symptoms
08:55 Dyrcona I'm writing an email to open-ils-dev with something related that I've seen.
08:56 jeff "marc_export creating MARC data that yaz-marcdump dislikes"
08:56 Dyrcona I think it's Encode.pm or possibly MARC::Charset with perl >= 5.20.
08:57 jvwoolf joined #evergreen
09:04 Dyrcona I will test something before I send that email.
09:09 jeff okay, yep. extract_holdings in this environment emits iso-8859-1 in the main holdings file, and i can override this by adding an encoding to the open() call, like this:
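jeff's fix is in Perl (an encoding layer on the open() call); the same idea rendered in Python for illustration — name the output encoding explicitly instead of inheriting whatever the locale provides:

```python
import os
import tempfile

# Write a non-ASCII string with an explicit encoding, then inspect the raw
# bytes to confirm UTF-8 came out (two bytes for 'é'), not latin-1 (one).
path = os.path.join(tempfile.mkdtemp(), "holdings.txt")
with open(path, "w", encoding="utf-8") as fh:  # explicit, not locale-derived
    fh.write("Montréal")

with open(path, "rb") as fh:
    raw = fh.read()

print(raw)  # b'Montr\xc3\xa9al'
```

Without the explicit encoding, a C/POSIX locale on the server quietly yields iso-8859-1-style output, which matches the symptom jeff saw in extract_holdings.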

Results for 2017-03-09

12:34 dbwells berick: The only thing I wonder off the top of my head is whether restoring overdues would work properly if you go that route.  If it doesn't, that is a bug in itself, but this route would give it a much wider path to surface.
12:35 berick thanks dbwells, I'll give it a shot
12:38 Dyrcona Has anyone successfully used marc_export on Debian 8 Jessie or Ubuntu 16.04?
12:39 Dyrcona I'm seeing an issue with looping over bre output and putting US MARC into a file on both of those distros.
12:39 Dyrcona It's just a loop of fretchrow_harshref, make a MARC::Record object, and write it to a file as US MARC.
12:40 Dyrcona The program eats all the RAM on the VM before any records are written.
12:42 Dyrcona fretchrow_harshref....Right, Raggie? Right on, Scoob!
12:42 berick dude, your harshref'ing my mellow
13:25 dbwells berick: The case of switching voids to adjustments for lost item overdues was not functionally necessary.  It was an attempt to take advantage of a richer set of tools to hopefully, eventually, actually have *more* clarity.  A guy can dream, can't he?
13:28 berick dbwells: also good to know.  and I agree in principle.  it's better.
14:12 NawJo joined #evergreen
14:13 Dyrcona Bmagic: Have you encountered any issues with MARC on Ubuntu 16.04? Am I right you're running Ubuntu 16.04?
14:13 Dyrcona csharp: I guess the same question goes to you, too.
14:14 Dyrcona jeff and I are poking and it's looking like issues with Encode.pm at the moment.
14:15 Dyrcona jeff is looking at Debian 8, and sees similar things, but slightly less bad in some ways. :)
14:21 Dyrcona That's one I see that jeff doesn't.
14:22 Dyrcona If I dump xml and then convert to usmarc with yaz-marcdump, things are better.
14:22 Dyrcona Though some of the records fail to parse.
14:22 Dyrcona So, looking like length calculations and Encode.pm in MARC::Record, maybe...
14:25 kmlussier Is this related at all to bug 1584891 ?
14:25 pinesol_green Launchpad bug 1584891 in Evergreen 2.11 "marc_export -i gives incorrect record length in the leader when call numbers include UTF8 characters" [Undecided,Fix committed] https://launchpad.net/bugs/1584891
14:26 * kmlussier has nothing else to offer, but saw similar words in the bug report.
14:28 Dyrcona Well, it could be, but I've seen really nutty behavior in my Boopsie extract program.
14:28 Dyrcona It just uses MARC::Record on the output of a database query.
14:30 Dyrcona Actually, no. It has to be something else.
14:31 Dyrcona I'm using master fetched as a couple of hours ago.
14:32 kmlussier OK, worth a shot. I'll go back to figuring out why I haven't received e-mail all day.
15:40 Dyrcona It's here if anyone wants to poke it, which I doubt: https://github.com/Dyrcona/boopsie
15:40 berick cure your woopsies with a dash of Boopsie
15:41 csharp berick++
15:41 Dyrcona I still suspect that there are issues with Encode and/or MARC::Record and Perl 5.22 on Ubuntu 16.04.
15:41 Dyrcona berick++ :)
15:41 berick it's fun not to work
15:42 Dyrcona I'll try it with concerto data later. We have some French and Italian records with accents.
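For reference, the symptom in bug 1584891 reduces to character counts versus octet counts: the record length in positions 0-4 of a MARC leader is defined in octets, so a length computed with character semantics undercounts as soon as UTF-8 multibyte characters appear — exactly the accented French and Italian records Dyrcona plans to test with. A tiny Python illustration:

```python
# MARC leader record length counts octets, not characters.
field = "Les misérables"
print(len(field))                  # 14 characters
print(len(field.encode("utf-8")))  # 15 bytes: 'é' occupies two octets
```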

Results for 2017-02-17

10:39 kmlussier Would anyone be willing to look at bug 1661747 for JBoyer today?
10:39 pinesol_green Launchpad bug 1661747 in Evergreen "Add get_org_unit_ancestor_at_depth to action trigger reactor helpers" [Wishlist,New] https://launchpad.net/bugs/1661747
10:40 kmlussier Dyrcona: Did you ever get any sample ISBNs to test the Czech added content module?
10:41 Dyrcona kmlussier: Better than that. They sent me some MARC records, that I admittedly have not looked at, yet.
10:41 Dyrcona I've been doing C/W MARS stuff this morning.
10:41 kmlussier Dyrcona: Do you think you'll have a chance to look at it today? If not, maybe you can send them my way and I can give it a try.
10:42 Dyrcona Yes, I was just about to say that I have all I need and should be able to look at it today.
14:11 JBoyer Shame xslt 2.0 isn't more usable, some of the built in funcs seem very handy. :/
14:11 JBoyer kmlussier, a newer MODS transform means less custom edits on my end! :D
14:11 Dyrcona berick: I'll have to see if I can find the original from 1929 on youtube later.
14:11 Dyrcona dbs: The marc looks good and is entityized.
14:11 kmlussier JBoyer: Fair enough then.
14:12 JBoyer (though there are a couple things that we want to display that I'm pretty sure will always be locally customized. :/ )
14:12 Dyrcona Bmagic: You're on pg 9.5. Have you had any issue with targeted uris?
14:33 Dyrcona So, I'm ready to sign off on it. kmlussier should I just push it?
14:33 kmlussier Dyrcona: Yes, please!
14:33 kmlussier Dyrcona++
14:34 Dyrcona My lack of skill with the MARC editor notwithstanding. :)
14:34 kmlussier And I get to add another new code contributor to the Evergreen wiki today. :)
14:43 mmorgan1 joined #evergreen
14:50 Dyrcona Y'know what. I think someone else should look at the code changes.
17:58 Dyrcona but yeah, berick, that is cool.
17:58 Dyrcona After I said I wished it had that feature, I thought I'd just give it a try and it worked.
17:59 _adb left #evergreen
17:59 Dyrcona And rather than run make ilscore-install in Open-ILS/src, I think I'll copy the marc templates with cp....
18:01 Dyrcona My day has gone on too long already. I should quit.
18:11 Dyrcona Ok. I'm out for now. May be back over the weekend. Peace, everyone!
18:59 sandbergja joined #evergreen

Results for 2017-01-29

15:57 kenstir joined #evergreen
16:06 kenstir How do I query the 856 field via OSRF?  In the Android app, I provide an "online access" button for electronic resources, and "place hold" for others.  I want to reliably determine whether a record is an online resource and if so the URL.  I have been using "open-ils.search.biblio.record.mods_slim.retrieve", but it is not working in some cases.
16:15 Dyrcona mods won't include the 856, I don't think.
16:16 Dyrcona You could retrieve the MARC of the record and parse that.
16:20 kenstir and can you lend me a clue as to how to retrieve the MARC of the record given a record ID?
16:23 kenstir I am grepping through Open-ILS/src/perlmods for all methods with "marc" in the name but so far this is not fruitful.
16:29 Dyrcona You can retrieve the bre object via pcrud. The marc field will have the marcxml representation of the record.
16:32 kenstir Thanks, that's a great lead!
16:33 Dyrcona You can also try open-ils.supercat.record.marcxml.retrieve
16:34 Dyrcona I'm having fun trying to get FreeBSD syslog to route messages through a script...

Results for 2017-01-24

15:07 pgardella Just wanted to make sure nothing was wrong, like we're not cleaning something up
15:08 berick no, it's just big
15:08 Dyrcona phasefx++ for mentioning that O'Reilly sysadmin humble bundle. It has paid for itself.
15:09 Dyrcona pgardella: Yes, it's big. There's a row for ever subfield of every MARC record.
15:10 phasefx Dyrcona: awesome
15:10 Dyrcona s/ever/every/
15:10 tsbere Plus the leader and some other data ;)

Results for 2017-01-13

15:46 phasefx yeah, marc_cleanup throws out empty tags
15:46 phasefx brb
15:48 jeff phasefx: yaz-marcdump converting to utf8 was how i ended up with an empty 020, yeah.
15:48 Dyrcona I usually write my own stuff in Perl and discard any records that MARC::File::XML doesn't like.
15:48 jeff yaz-marcdump -v -f MARC-8 -t UTF-8 -o marc -l 9=97 input.mrc > output.utf8.mrc
15:49 Dyrcona Oddly enough, it looks like I'll get through this again for real this summer.
15:49 Dyrcona We'll have a new branch added with records from Voyager or something like that.
15:50 Dyrcona I say oddly 'cause I only just found out about the time this conversation started.
15:51 Dyrcona I notice that MARCEdit (is that the program?) can also tolerate some junk in the MARC that MARC::Record and friends don't like.
15:52 jeff i am shocked -- SHOCKED -- to find that there is variance in what different MARC-using software tolerates and emits.
15:52 phasefx jeff: fun.  My chain looks like this: chardet, yaz-marcdump, marc_cleanup, marc2bre, parallel_pg_loader, split, wrap chunks in BEGIN; -- flag munging COMMIT; sql, parallel, quick_metarecord_map
15:52 Dyrcona hah!
15:56 phasefx either marc_cleanup is saving me, or I've lucked up on catastrophic-failure-causing records
15:57 Dyrcona I deal with catastrophic records by using my own parser for binary records.
15:57 Dyrcona Just IO::File with $/ = '\x1f\x1e';
15:58 Dyrcona Put an eval around the MARC::Record->new_from_usmarc() and put any that blow up into a reject file.
15:58 * jeff nods
15:58 Dyrcona I think I got that delimiter correct. That was from memory. :)
15:59 phasefx I think that's what marc_cleanup does
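For the record, ISO 2709's record terminator is 0x1D; the 0x1E/0x1F pair Dyrcona recalled are the field terminator and subfield delimiter. His split-and-reject idea sketched in Python with only the standard library — the cheap leader-length check stands in for the eval around MARC::Record->new_from_usmarc, and the sample bytes are fabricated:

```python
RT = b"\x1d"  # ISO 2709 record terminator (0x1e/0x1f are field/subfield marks)

def split_records(blob: bytes):
    """Split raw binary MARC; shunt records failing a sanity check aside."""
    good, rejects = [], []
    for raw in blob.split(RT):
        if not raw:
            continue
        raw += RT  # restore the terminator stripped by split()
        # Stand-in for eval { MARC::Record->new_from_usmarc($raw) }:
        # leader positions 0-4 must be the record length in ASCII digits.
        if raw[:5].isdigit() and int(raw[:5]) == len(raw):
            good.append(raw)
        else:
            rejects.append(raw)  # would go to the reject file
    return good, rejects

# One plausible record (declared length 22 matches), one piece of junk.
blob = b"00022nam  junkleader\x1e\x1d" + b"BADRECORD\x1d"
good, rejects = split_records(blob)
print(len(good), len(rejects))  # 1 1
```

As the channel notes, a length check like this catches truncation but not everything; MARC::Record can succeed even on records that are semantically bad.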

Results for 2016-12-27

08:03 Dyrcona joined #evergreen
08:30 collum joined #evergreen
08:41 mmorgan joined #evergreen
08:49 Dyrcona Nice. Got a record that throws a 500 internal server error every time you load it, but I don't see anything obviously wrong with the MARC.
09:02 JBoyer Dyrcona, sounds like a great record. Want to share?
09:02 Dyrcona Maybe in a bit. I'm being bombarded with issues.
09:03 mmorgan Dyrcona: Merry Christmas!

Results for 2016-12-21

09:55 csharp our cataloger has wanted something similar before to identify "incomplete" bibs
09:56 * Dyrcona has recently found that left joins to mrfr can be slow, but that's basically how I'd do it.
09:57 Dyrcona Oh, and upshot of my Perl threads experiment yesterday is, Don't do it, kids.
09:58 Dyrcona MARC::File::XML is not thread safe because Encode.pm is not thread safe.
09:58 Dyrcona @blame 9 Encode.pm
09:58 pinesol_green Dyrcona: Encode.pm is why we can never have nice things!
10:20 berick can anyone point me at the docs that form the basis for the authority control set mappings, specifically LoC?
10:24 jeff "You map that way so often. I wonder what your basis for comparison is."
10:30 jvwoolf joined #evergreen
10:42 Dyrcona berick: You could start here: http://www.loc.gov/marc/authority/
10:48 berick Dyrcona: yeah I started there...  did not find what I was looking for.
10:48 Christineb joined #evergreen
10:48 berick i'm sure it's in there somewhere
10:55 Dyrcona berick: I don't think there is a simple document for those. I think it comes right out of the authority field specifications.
10:57 berick Dyrcona: i'm happy to dig if I'm digging in the right place.  and i'm probably looking right past it, but I can't seem to find an example of bib field X is controlled by authority field Y (for LoC thesaurus) in the LC docs.
10:58 Dyrcona berick: Yeah. I thought that is what you're looking for and I don't see it, either.
10:59 Dyrcona berick: Maybe this: http://www.loc.gov/marc/authority/ecadcntf.html
11:00 Dyrcona Eh, maybe not. That appears to just explain what to do when you're making a link in a controlled subfield.
11:03 Bmagic Dyrcona: years ago, I programmed a "multi" thread perl program that used Encode.pm; however, to get around the multi-thread issue, I spawned separate instances that all wrote back to their own pid file
11:04 Bmagic then the master thread periodically checked up on the "threads" by reading that file
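Bmagic's workaround, separate processes instead of threads, each writing to its own per-pid file that the master later collects, can be sketched in Python. The file naming and the per-record transform are illustrative, not from the log:

```python
import os
from multiprocessing import Process


def worker(records, outdir):
    # Each child writes results to a file named for its own pid,
    # sidestepping the shared-interpreter state that makes Encode.pm
    # unsafe under Perl ithreads. The .upper() call is just a
    # stand-in for the real per-record work.
    path = os.path.join(outdir, "worker-%d.out" % os.getpid())
    with open(path, "w") as fh:
        for rec in records:
            fh.write(rec.upper() + "\n")


def run_in_processes(batches, outdir):
    # "Master" process: spawn one child per batch, wait for them,
    # then read back the per-pid files, the check-up Bmagic describes.
    procs = [Process(target=worker, args=(batch, outdir)) for batch in batches]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    results = []
    for name in sorted(os.listdir(outdir)):
        with open(os.path.join(outdir, name)) as fh:
            results.extend(line.rstrip("\n") for line in fh)
    return results
```

Because each worker is a full process with its own interpreter, nothing about the encoding layer is shared, which is the whole point of the trick.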

Results for 2016-12-20

10:28 * Dyrcona mostly learns why it is discouraged. :)
10:30 Dyrcona ithreads+-
10:42 jwoodard joined #evergreen
10:55 Dyrcona MARC::File::XML is not thread safe....This could get interesting.
11:05 Christineb joined #evergreen
11:20 Callender joined #evergreen
11:36 jihpringle joined #evergreen

Results for 2016-12-09

12:16 Enjabain joined #evergreen
12:18 Enjabain Hello, I know you can limit search results to a create date range, but can you sort?
12:18 Enjabain by Create Date
12:18 Dyrcona Enjabain: You mean create date of the MARC record?
12:20 Newziky joined #evergreen
12:21 Enjabain Not sure... I am hoping to list most recent additions to a specific libraries catalog.
12:22 Newziky left #evergreen

Results for 2016-12-03

10:17 Dyrcona I should probably blog it....
10:46 Dyrcona And the more I play with them, the more I understand where the differences are coming from. :)
10:46 Dyrcona understanding++
10:47 Dyrcona The upshot is, I'm going to skip MARC::Batch and use IO::File with record separator set to parse the files, and then call MARC::Record->new_from_usmarc().
10:47 Dyrcona With eval in the right place, I can skip "bad" records and load the rest.
10:49 Dyrcona It is significantly harder to do that with MARC::Batch.
11:22 Dyrcona Ah, well. Going out, so putting the laptop to sleep.
13:22 bmills1 joined #evergreen
17:01 pinesol_green News from qatests: Test Success <http://testing.evergreen-ils.org/~live>

Results for 2016-11-29

12:52 Dyrcona @dessert 46 bshum
12:52 * pinesol_green grabs some a thick slice of Jubilee Roll for bshum
12:52 Dyrcona Meh. The plugin is "too smart."
12:57 Dyrcona I was looking at the deprecated function because I was looking at alternatives for trying to look up the item form in incoming marc records.
12:57 Dyrcona Looks like I'll make an object module to read some data from the database and create lookup tables.
12:58 Dyrcona It'll have methods to look up various attributes for a given MARC::Record.
13:12 * kmlussier rediscovers bug 1522644 and contemplates raising the discussion on the mailing list again to remove that "Transfer All Title" holds button from the web client.
13:12 pinesol_green Launchpad bug 1522644 in Evergreen "webclient: Transfer title holds issues" [Medium,New] https://launchpad.net/bugs/1522644
13:14 mmorgan kmlussier++
15:55 agoben lol, yep!  Sorry about that!
15:58 mmorgan joined #evergreen
16:33 Dyrcona Bit late, I know, but....
16:34 Dyrcona I have a record that blows up in MARC::Batch with this message: utf8 "\xE8" does not map to Unicode at /usr/lib/perl/5.14/Encode.pm line 176.
16:34 Dyrcona I already have strict_off(), so I wonder how to trap that error and keep on going.
16:34 Dyrcona Maybe I should have a look at Encode.pm.
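One way to trap a per-record error like that and keep going is to pull records through a wrapper that catches the exception from each fetch. A Python sketch of the pattern, where `batch_next` stands in for `$batch->next`; note that if the underlying stream loses sync after a bad record, bypassing MARC::Batch entirely and splitting on the record terminator yourself may be the safer route:

```python
def tolerant_iter(batch_next):
    """Yield (record, None) for each good record and (None, error)
    for each record that blows up, instead of letting the first bad
    record kill the whole loop."""
    while True:
        try:
            rec = batch_next()
        except StopIteration:
            return
        except Exception as exc:   # e.g. '"\xE8" does not map to Unicode'
            yield None, exc
            continue
        if rec is None:            # MARC::Batch-style end-of-input signal
            return
        yield rec, None
```

The caller can then log the errors and carry on with the records that parsed.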
16:45 bmills joined #evergreen

Results for 2016-10-27

16:03 kmlussier JBoyer: I don't know. She didn't say.
16:03 Dyrcona I might have. I don't remember what Groton was using before they joined MVLC.
16:04 Dyrcona What do ya know... They were on Library.Solution from TLC.
16:05 Dyrcona If she can get the records into either binary MARC or MARCXML, they should load.
16:05 kmlussier OK, that gives me enough to write a somewhat helpful response. Thanks!
16:05 kmlussier dbs++ JBoyer++ Dyrcona++
16:06 dbs kmlussier++ # actually responding
16:06 Dyrcona And, here's a quote from the TLC Extraction Services document: The library can extract their own bibliographic MARC records from Library•Solution® using Cataloging Utilities/Extract Records feature.  The extracted file is MARC Communication format.
16:06 kmlussier dbs: I'm surprised I caught it. I usually just assume those e-mails are spam.
16:06 rhamby I've gotten records from TLC before and they were standard bin marc files
16:07 Dyrcona rhamby kmlussier: That's what I suspect the user has. They just don't know it. But I don't know anything about it beyond what I was given and what's in the TLC document that I still have.

Results for 2016-10-25

11:05 csharp well, at the time I authored that, the DB was all I really knew
11:05 csharp in fact, that was an early attempt to use perl instead of bash for everything
11:07 Dyrcona Yeah. I often write my db stuff to use Perl if it does anything more complicated than a couple of SQL statements.
11:07 Dyrcona I pretty much always manipulate MARC with MARC::Record rather than trying xml junk in the db.
11:22 maryj joined #evergreen
11:22 sandbergja joined #evergreen
11:41 Christineb joined #evergreen

Results for 2016-10-20

11:11 kmlussier The steps are explained in step 4 of the "Preparing for order record loading" section at http://docs.evergreen-ils.org/2.11/_ordering.html. Maybe it will point you in the right direction?
11:12 berick i remember bug #1519465, but that's only for parsing responses
11:12 pinesol_green Launchpad bug 1519465 in Evergreen 2.9 "Purchase Orders with spaces in the name cause problems with EDI processing" [Medium,Fix released] https://launchpad.net/bugs/1519465
11:15 Dyrcona kmlussier: Thanks, but the closest thing there is getting the PO name out of the marc that comes back.
11:15 Dyrcona The installation instructions are not helpful.
11:15 Dyrcona Not for this question.
11:15 Dyrcona I may just open a ticket with Equinox to ask what they did last time.

Results for 2016-10-12

09:14 jeff frustrating when the most reliable identifier for a vendor-provided bib tends to be the URL.
09:15 jeff because ignoring for the moment that vendors change urls from time to time, it gets in the way of an overlay attempt where the reason you're overlaying is because you're changing the url for something like proxying purposes.
09:16 jeff almost need to export, match externally and affix the 901$c from the exported record to the new record, then overlay on exact match (901$c).
09:17 Dyrcona Yep. Reliable and MARC are generally not the same thing.
09:19 jeff 035$a + 020$z might work for some/most of the batch i'm currently looking at, but i'm pretty sure the 035 can go from (VENDOR) to (OCoLC) in the source files.
09:19 * tsbere has attempted to play with match sets in his dev machine, but can't even make one right now for some reason
09:20 jeff i believe the create button for a match set will silently fail if you lack permission.
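The 035$a + 020$z matching jeff floats can be sketched as a composite key; stripping the parenthesized source prefix from the 035 is one way to handle the (VENDOR)-to-(OCoLC) drift he mentions, though that is an editor's suggestion, not something from the log. The record accessor here is a hypothetical dict, not a real MARC library call:

```python
import re


def match_key(rec):
    # rec is just a dict of repeatable field/subfield values,
    # e.g. {"035a": [...], "020z": [...]}, standing in for a real
    # MARC record accessor. Dropping the parenthesized prefix lets
    # "(VENDOR)123" and "(OCoLC)123" still match.
    ocns = sorted(re.sub(r"^\([^)]*\)", "", v) for v in rec.get("035a", []))
    isbns = sorted(rec.get("020z", []))
    return (tuple(ocns), tuple(isbns))
```

Keys built this way can be matched externally, after which the 901$c from the exported record is affixed to the new record for an exact-match overlay, as jeff outlines.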

Results for 2016-09-23

15:30 Dyrcona Doesn't affect this, though.
15:32 dbs with unapi and xml-in-json, I wouldn't be surprised to see us getting closer to .3 megabytes per bib! heh.
15:32 berick yeah, xml-in-json-in-xml
15:33 Dyrcona But MARC sets the limit at 99,999 bytes. :)
15:34 dbs you made me consider xml-in-json-in-xml-in-marc you beast
15:35 * Dyrcona hands out BLOBs.
15:36 Bmagic kmlussier: phew - it turned out to be some sort of browser caching issue - opened incognito and voilà

Results for 2016-09-15

10:41 Dyrcona I know the nuts and bolts of it, just looking at the two main options and weighing pros and cons.
10:42 Dyrcona I'm still working on merging the syrup configs with the main opensrf.xml. I love it when the diffs are in a commented block, but I can't tell from the diff. :)
10:43 Dyrcona Turns out that I'm mostly setting things back to the defaults.
10:46 Dyrcona Hmm.. Will it override all of the marc templates, or is it just additive....The code looks additive to me.
10:46 Dyrcona Well, I'll find out. :)
10:47 tsbere Dyrcona: I suspect it will override any that match and add the rest
10:48 Dyrcona tsbere: That is what it looks like.

Results for 2016-09-08

16:35 Dyrcona Turns out Clark was already running, anyway.
16:35 kmlussier ssieb: I've only had knowledge of a few systems, but, in my experience, the holdings information is usually in one field. The field itself may vary from system to system, as well as the subfields used for particular pieces of data. But, like I said, my experience is limited.
16:36 dbs kmlussier: easy for me to come up with ideas, then run away and hope someone else implements them; the hard part is actually making it happen
16:36 Dyrcona ssieb: The records are in a format called MARC. It has numeric fields and usually alphabetic subfields. So everything will be in the same field, but different pieces will be in different subfields.
16:37 ssieb yes, I'm aware of that.  I've learned far more about this stuff than I really wanted to. :-)
16:37 Dyrcona :) That seems to be what usually happens.
16:37 kmlussier ssieb: Welcome to the club! :)
16:39 ssieb It's not showing that it finds the barcode, but maybe that's because that would be attached to the items and those are failing on import.
16:40 Dyrcona ssieb: That's what I suspect.
16:40 kmlussier ssieb: Yes, so if the circ modifier stops the items from importing, then you won't be able to import any of that item information.
16:40 Dyrcona kmlussier mentioned a program called MARCEdit earlier. You can use it to make batch edits to a file full of MARC records.
16:40 dbwells Dyrcona: now action_triggers in dev, there's the real trouble :)
16:41 Dyrcona You could use it to add the same circ modifier to all of the records and then reimport it.
16:41 Dyrcona dbwells: Not if you redirect all of your server's email to /dev/null. ;)

Results for 2016-08-25

17:11 mmorgan left #evergreen
17:11 berick achievement unlocked!
17:17 Dyrcona heh
17:18 Dyrcona Reading these release notes, I wonder if one or two of these should have been targeted at 2.9.7 but weren't. Specifically the one about marc file extensions.
17:18 * Dyrcona goes searching for the bug.
17:19 Dyrcona Ok looks like that was 2.10 only.
17:19 Dyrcona I should have remembered Janet saying that it didn't happen in production. :)

Results for 2016-08-12

11:58 brahmina joined #evergreen
12:01 gmcharlt tsbere: gut reaction - they're unlikely to cause significant issues, and I'll likely merge it soon after writing some test cases
12:02 gmcharlt my only quibble at the moment is whether to add a flag to specify a strict input mode that squawks if \035 is found inside a record, as there's no (known-to-me) character encoding for MARC records where that would be permissible
12:04 Dyrcona gmcharlt: I've seen a number of MARC records in the wild that do not use valid character encodings for MARC, and sometimes different fields in the same record have different encodings.
12:04 Dyrcona But, I'm going to lunch, so.... ;)
12:04 gmcharlt Dyrcona: well yes :)
12:04 gmcharlt hence "strict" mode

Results for 2016-08-09

16:29 kmlussier Justin__: It sounds like you added the volume without going to the next step to add a copy. There's a library setting that allows you to add both from the same screen, which our libraries find useful.
16:30 kmlussier Justin__: No, I don't think there is a way to do that. I guess the assumption is you would want a cataloger to add the correct call number and to make sure the barcode matches what is on the item.
16:33 Justin__ Makes sense.  Thank you for your help!
16:34 Dyrcona Justin__: Can you get your MARC records with holdings info?
16:34 Dyrcona Assuming you could export them again from the source, that is.
16:34 bshum I can think of some ways to populate dummy data for a given set of MARC bibs, but yeah kmlussier is right that it would be better to have actual correct call numbers for the volumes and then barcodes for the copies.
16:35 Justin__ The records we have came from the WorldCatalog.  We had a record of what books we had, but not holdings there (I'm not even sure you can enter holdings).

Results for 2016-08-08

13:59 bshum Older than that probably
14:00 bshum 2008
14:00 bshum https://github.com/code4lib/supybot-plugins/commits/master/edsu-plugins/MARC
14:01 Dyrcona Yeah, but the MARC records using $q are still in the minority.
14:02 rfrasur joined #evergreen
14:02 dbs unless of course you have a trigger that automatically splits on a space after a valid ISBN in $a and pushes the remainder into $q :)
14:03 csharp dbs: thanks! - I was just looking for a pointer before digging into LOC docs :-)

Results for 2016-08-04

12:35 bmills joined #evergreen
12:35 bmills joined #evergreen
13:16 gsams joined #evergreen
13:47 Dyrcona @quote add <jeffdavis> MARC is like a can of worms, and each worm is wrapped around another, smaller can of worms.
13:47 pinesol_green Dyrcona: The operation succeeded.  Quote #156 added.
13:48 Dyrcona @quote random
13:48 pinesol_green Dyrcona: Quote #111: "< RoganH> Obviously they weren't from the south or they would have tried deep frying it." (added by csharp at 12:22 PM, April 15, 2015)

Results for 2016-08-03

13:29 BookThief "Suspense," How would I do so? Any help is appreciated!
13:30 Dyrcona BookThief: What version of Evergreen?
13:31 BookThief 2.8.3, I believe.
13:31 Dyrcona Evergreen adds genre indexing from the 655 MARC field.
13:31 Dyrcona Oops, I meant to say Evergreen 2.10 adds it.
13:32 BookThief Is it not possible on 2.8.3?
13:32 mrpeters joined #evergreen
13:33 Dyrcona If you want to use the 655 stuff, just run the  0952.data.genre-indexing.sql on your database and then reingest your records.
13:38 Dyrcona If you want a custom index, I suggest visiting the dokuwiki http://wiki.evergreen-ils.org/doku.php?id=start and searching for indexing or custom search index in the search box.
14:00 Dyrcona @marc 655
14:00 pinesol_green Dyrcona: Terms indicating the genre, form, and/or physical characteristics of the materials being described. A genre term designates the style or technique of the intellectual content of textual materials or, for graphic materials, aspects such as vantage point, intended purpose, or method of representation. A form term designates historically and functionally specific kinds of materials distinguished (1 more message)
14:00 Dyrcona @more
14:00 pinesol_green Dyrcona: by their physical character, the subject of their intellectual content, or the order of information within them. Physical characteristic terms designate historically and functionally specific kinds of materials as distinguished by an examination of their physical character, subject of their intellectual content, or the order of information with them. (Repeatable) [a,b,c,v,x,y,z,2,3,5,6,8]

Results for 2016-07-28

14:41 Dyrcona heh
14:50 Dyrcona Zen and the Art of Database Management. :)
14:52 Stompro Hello everyone, are there any tables like metabib.real_full_rec that have the raw subfield data?  I want to grab data where punctuation is important, I need to see colons, commas and slashes, which get stripped out of metabib.real_full_rec.  Thanks
14:53 Dyrcona Stompro: You could use xpath on the marc from biblio.record_entry.
14:53 Dyrcona That is the most common way to do it with SQL.
14:54 Dyrcona But, to answer your question as stated: No.
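A minimal sketch of what Dyrcona describes, untested here against a live Evergreen database: PostgreSQL's `xpath()` over the `marc` column, using the MARC21 slim namespace mapping that also appears in other queries in these logs. The 245$a is just an example field:

```sql
-- Pull the 245$a (title proper) straight out of the stored MARCXML.
SELECT id,
       (xpath('//marc:datafield[@tag="245"]/marc:subfield[@code="a"]/text()',
              marc::XML,
              ARRAY[ARRAY['marc', 'http://www.loc.gov/MARC21/slim']]))[1] AS title
  FROM biblio.record_entry
 WHERE NOT deleted
 LIMIT 10;
```

Unlike metabib.real_full_rec, this reads the raw subfield data, so punctuation such as colons, commas, and slashes survives.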
14:54 jvwoolf joined #evergreen

Results for 2016-06-03

10:33 gsams I go to a local shop called Donut Paradise, really nice lady, supports our summer reading program
10:36 kmlussier terran++ # For organizing a great Bug Squashing Day!
10:36 kmlussier That I mostly missed. :(
10:49 Dyrcona Whee! Fun with new MARC templates: subfield s comes before subfield b in this tag....
10:52 * Dyrcona hands himself a lollipop. :)
11:02 jvwoolf joined #evergreen
11:04 rjackson_isl all this talk of donuts reminds me of an Indiana chain closed down years ago due to "extra" ingredients added by the 4 legged residents of the shop! :(

Results for 2016-05-04

09:07 * pinesol_green brews and pours a cup of Kenya Peaberry Ruera Estate, and sends it sliding down the bar to Stompro
09:13 terran joined #evergreen
09:29 yboston joined #evergreen
09:31 Dyrcona Who knew: http://search.cpan.org/~dchris/Tk-MARC-Editor-1.0/lib/Tk/MARC/Editor.pm ?
09:41 bshum "TK-421, why aren't you at your post?"
09:42 abowling joined #evergreen
09:51 Dyrcona bshum: https://plus.google.com/+ReverendEricHa/posts/T2yxWEGL24R?pid=6280806173141581186&oid=103046039519355433778 :p
14:59 csharp The Resistance is the BEST!
14:59 terran @loves
14:59 pinesol_green terran: terran doesn't seem to love anything.
15:00 Dyrcona @whocares MARC
15:00 pinesol_green mllewellyn loves MARC
15:00 pinesol_green bshum, jcamins, rfrasur, Dyrcona, tsbere, gsams, collum, csharp, JBoyer and Bmagic hate MARC
15:00 Bmagic wow

Results for 2016-05-02

09:41 Dyrcona I'll see if I can find the link to the LoC recommendation.
09:41 dbwells I don't think there is anything in Evergreen to prevent records with the same 001, is there?
09:41 Dyrcona Yes, there is or at least was.
09:44 Dyrcona https://www.loc.gov/marc/bibliographic/bd001.html says "may," but I swear I read something else from LoC that says "recommend."
09:46 dbwells Probably uniqueness was necessary for the "tcn_value" field on bre, but keeping that in lockstep with 001 has never been a serious goal in the code.  It was a Sirsi carry-over, and now holds a lot of different stuff.
09:47 csharp marc--
09:52 dbwells I think that manipulating the 001/035 stuff complicates life without good reason.  We have so many better ways of reliably handling our own identifiers without putting them into the record blob.  In addition, keeping the option off by default is more data conservative, since it is very easy to turn on and do the update, but not easy to go the other way.

Results for 2016-04-25

13:10 Dyrcona Yeah, that sounds interesting... Would you block deposit items from going into transit, for instance?
13:10 csharp right
13:11 csharp that's the main issue, actually
13:11 Dyrcona We did that by marc type and circ lib, but a deposit entry would be more useful.
13:11 Dyrcona Usually deposit was only charged on marc_type g.
13:11 Dyrcona I don't think anyone charges deposits any more.
13:11 * Dyrcona should really take notes. :)

Results for 2016-03-18

08:46 ethomsen1 joined #evergreen
08:49 Dyrcona joined #evergreen
08:50 krvmga joined #evergreen
09:36 Dyrcona Are there any good examples of queries using xpath in the database to extract fields from bre.marc?
09:37 jeff if you don't care about the fields potentially being normalized, i go with metabib.real_full_rec
09:37 miker Dyrcona: what are you looking for? there are plenty in the code...
09:38 jeff but since you're asking, you probably care about the values being normalized (i.e., don't want the normalization)
09:48 yboston joined #evergreen
09:48 Dyrcona jeff: The staff client doesn't handle now holdable precats the way I'd like.
09:49 Dyrcona I was doing basically what miker posted, 'cept I had "field" not "datafield."
09:49 Dyrcona 'Cause y'know, I had just used MARC::Record->field()
09:51 miker I wish the perl module had always separated those ... the structure is different, and the "tag < 010" rule is just a USMARC thing
09:52 mllewellyn joined #evergreen
09:54 jeff Dyrcona: since it's NCIP, mostly i don't care about the staff client, but... in what way would you like them to behave that they do not currently?

Results for 2016-03-02

11:57 dbwells If I follow the log, I see a bunch of object requests / perm checks, then they just stop.  I should go back and check if they always stop at the same place.
11:59 JBoyer joined #evergreen
11:59 * csharp has never uttered the phrase "MARC is good"
11:59 Dyrcona MARC is MARC.
11:59 csharp trudat
11:59 * Dyrcona goes to get some lunch.
12:00 * Dyrcona also thinks he's finally got his restore of Evergreen to a "new" server worked out.

Results for 2016-03-01

14:59 kmlussier Looks like there are still some issues
14:59 berick i was using a validation tool when doing that work.  I think it was called Total Validator
14:59 berick and, yeah, I'd be surprised if there weren't some issues still.  they're easy to miss and it's also easy to add new ones
14:59 Dyrcona On a totally unrelated note, does anyone use MVLC's MARC templates branch?
15:01 kmlussier berick: hmmm...might be a good thing to check for at every major release. To make sure we don't introduce new ones
15:01 Dyrcona @eightball Does anyone use our MARC templates?
15:01 pinesol_green Dyrcona: _I_ don't know.
15:03 Dyrcona Well, neither do I, pinesol_green.
15:03 kmlussier Of course, the question posed to me was regarding 508 compliance rather than WCAG. Apparently an entirely different thing.

Results for 2016-02-22

15:50 Dyrcona Oh, never mind... That works.
15:50 Dyrcona Yeah, it works for me on Pg 9.1 and 9.3.
15:54 * Dyrcona gives a specific name to a generic script..... D'oh!
15:55 Dyrcona I named it for the library and the marc type, but I pass those in as arguments.
16:08 maryj joined #evergreen
16:10 berick kmlussier: I just removed myself from assignee on bug 1527342.  let me know if you have any probs or if it needs rebasing.
16:10 pinesol_green Launchpad bug 1527342 in Evergreen "Maintain patron reading history in a dedicated table." [Wishlist,New] https://launchpad.net/bugs/1527342

Results for 2016-02-09

11:19 lualaba How can there be bad records? Also, I tried downloading from here: http://wiki.evergreen-ils.org/doku.php?id=evergreen-admin:importing:bibrecords
11:20 Dyrcona lualaba: You did follow the suggests of the bold text in read at the top of the second URL, right?
11:20 * Dyrcona cannot type today.
11:21 Dyrcona lualaba: Also, the Gutenberg records are already binary MARC, you don't have to do anything to convert them, except bunzip them.
11:21 lualaba Is that an older instruction?
11:22 lualaba And how do I import binary MARC records into the db?
11:23 Dyrcona lualaba: Try Vandelay. I was able to import the Gutenberg records through Vandelay last time I tried in 2012.
11:24 Dyrcona The client timed out, but the import eventually finished.
11:24 lualaba i use version 2.8.1
11:25 Dyrcona Vandelay is the "MARC Batch Import/Export" option on the cataloging menu.
11:25 Dyrcona The documentation should tell you all you need to know.
11:25 lualaba I know, but is there any limit, or does it just need time?
11:26 lualaba After 1000 records I don't see any progress

Results for 2015-12-22

09:32 Dyrcona It's what we call a "brief" record. It will get overlaid from OCLC eventually.
09:32 mrpeters joined #evergreen
09:32 Dyrcona It has the local 590.
09:33 Dyrcona @marc 550
09:33 pinesol_green Dyrcona: Information about the current and former issuing bodies of a continuing resource. (Repeatable) [a,6,8]
09:33 Dyrcona @marc 650
09:33 pinesol_green Dyrcona: A subject added entry in which the entry element is a topical term. (Repeatable) [a,b,c,d,e,v,x,y,z,2,3,4,6,8]
09:33 Dyrcona @marc 500
09:33 pinesol_green Dyrcona: General information for which a specialized 5XX note field has not been defined. (Repeatable) [a,3,5,6,8]
09:34 * Dyrcona is trying to remember what field the titles of a compilation go into.
09:34 Dyrcona That's the field this should be.
09:39 maryj_ joined #evergreen
09:40 Dyrcona csharp++
09:40 Dyrcona heh.
09:40 Dyrcona @blame [marc The]
09:40 pinesol_green Dyrcona: unknown tag The is why we can never have nice things!
09:40 tsbere @blame [quote random]
09:40 pinesol_green tsbere: Quote #62: "< Dyrcona> À propos a migration from TLC: If you have a column called TOTALINHOUSEUSES you should also have TOTALOUTHOUSEUSES must eat cottage cheese! for symmetry's sake." (added by csharp at 11:49 AM, July 22, 2013)
12:40 Dyrcona I imagine the author pwns one of these: https://plus.google.com/+ReverendEricHa/posts/Qn4aTEytdqn?pid=6231152009976367506&oid=103046039519355433778
12:41 jeff csharp: actually, you'll want to add a criteria to attempt to avoid invalid xml.
12:44 jeff SELECT id FROM biblio.record_entry WHERE xml_is_well_formed(marc) AND xpath_exists('//marc:record/marc:datafield[@tag="505"]/marc:subfield[@code="t"]', marc::XML, ARRAY[ARRAY['marc', 'http://www.loc.gov/MARC21/slim']]);
12:44 Dyrcona jeff: have you seen much invalid xml in your marc records?
12:44 jeff found at least one just now.
12:46 Dyrcona I'm running select id from biblio.record_entry where not xml_is_well_formed(marc) on my development database right now to see what I find.
12:46 phasefx hrmm, there should be a a_marcxml_is_well_formed trigger on bre
12:47 jeff immediate 500 error on supercat marcxml retrieval, mods takes a bit to return an empty collection, standard catalog page returns quickly (but mostly broken), and the MARC Record view in the catalog seems to take a while too.
12:48 jeff delays might be unrelated, but i wonder if something gets... stuck.
