IRC log for #evergreen, 2016-10-26

All times shown according to the server's local time.

Time Nick Message
04:02 gsams_ joined #evergreen
06:40 rlefaive joined #evergreen
07:17 rjackson_isl joined #evergreen
07:22 agoben joined #evergreen
08:40 mmorgan joined #evergreen
08:41 mmorgan1 joined #evergreen
08:42 mmorgan2 joined #evergreen
08:43 mmorgan3 joined #evergreen
08:48 bos20k joined #evergreen
08:50 mmorgan3 left #evergreen
08:51 mmorgan joined #evergreen
08:53 rlefaive joined #evergreen
09:00 Dyrcona joined #evergreen
09:09 yboston joined #evergreen
09:35 kmlussier joined #evergreen
09:37 DPearl joined #evergreen
09:50 tsbere Throwing this out here just because: MVLC was tired of bounces from email addresses forwarded on other servers being impossible to match back to a patron. So I just added the patron ID as VERP info in the sending address.
09:50 tsbere This required tweaking the mailing lists whose addresses we send notices out as, too, but I think the effort was worth it.
10:05 Bmagic interesting
10:11 tsbere If anyone wants more details or just has questions feel free to ask me.
10:14 * tsbere could actually see this as something that might be useful in the docs, but isn't interested in writing it up *for* the docs
10:36 Stompro tsbere, I would love to see details about this.  The SMS notices are the ones that would help me the most, since most of the gateways only return a generic error message.
10:36 Stompro I would be willing to write up something on it also.
10:37 tsbere Stompro: We use a mailing list that can be configured to allow the regex "listname\+[0-9]+@domain" as acceptable addresses and store the "email to send as" in org unit settings.
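For context, a minimal sketch of the mailing-list side of what tsbere describes, assuming Mailman 2 (the list name and domain are illustrative; MVLC's actual configuration isn't shown in the log):

    # Mailman 2 list setting (Privacy options -> Recipient filters):
    # accept any plus-addressed variant of the list address, so that
    # VERP-style recipients like listname+12345@example.org pass the
    # implicit-destination check.
    acceptable_aliases = """
    ^listname\+[0-9]+@example\.org$
    """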
10:38 * mmorgan was wondering if this might help with SMS notices also. That's where we have the problem, not so much with email.
10:39 tsbere Stompro: My changes to the templates in Evergreen were basically "add .replace('^(<id regex>-notices)@domain$','$1+' _ user.id _ '@domain') to the helper function returning the email address in all of our templates
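A hedged sketch of what that might look like inside an email Action Trigger template. The helper call mirrors Evergreen's stock hold-notice templates, but the variable bindings, list name, and domain are assumptions standing in for the "<id regex>" and domain tsbere elides above:

    [%- # Illustrative Action Trigger template fragment, not MVLC's actual code.
        # "user" is assumed bound to the patron object (e.g. target.0.usr);
        # "listname-notices" and example.org stand in for the elided values.
        sender = helpers.get_org_setting(user.home_ou.id, 'org.bounced_emails')
                 || default_sender;
        sender = sender.replace('^(listname-notices)@example\.org$',
                                '$1+' _ user.id _ '@example.org');
    -%]
    From: [% sender %]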
10:40 * tsbere thinks that MVLC's SMS messages are currently going out as a noreply address, though, so his DB-level search and replace wouldn't have touched them...
10:40 tsbere I should see about changing them now that we would have a hope of doing something about them
10:44 tsbere At any rate, staff can then use the ID in the "To:" email of the bounce/reply/whatever with "retrieve patron via database id" in the client to see exactly which patron the original message was sent to
10:44 tsbere I didn't use the barcode because I can't easily guarantee that the barcode is valid for inclusion in an email address without munging it.
10:46 tsbere Stompro: Any questions?
10:46 Stompro tsbere, sorry, on a phone call... brb
10:47 tsbere mmorgan: I will note that I was spurred into doing this by a hold notification (we think) bouncing after a server in the middle tried to forward it, so the bounced email address isn't on the patron record.
10:47 tsbere The bounce tells us basically *nothing* of use as a result.
10:49 mmorgan gotcha. We get *replies* to sms hold notifications that we can't follow up on for the same reason -  no useful info.
10:50 tsbere mmorgan: I think that is why, until now at least, we left them going out as the fall-through default opensrf.xml-defined "noreply@mvlc.org" - When people get out of meetings I will be asking if I should adjust our SMS messages to go out with the VERP-style info as well.
10:54 tsbere mmorgan: I will note that not all email systems support the VERP-style addresses. Exchange systems, for example, dislike them. So there is that headache.
10:55 jeff there's a lightning talk: stop sending sms via sms-to-email gateways
10:55 tsbere jeff: In this case I think I prefer the sms-to-email gateways. Very few ways in straight-up SMS to sneak in extra identifiers for the response ;)
10:56 jeff you might be prioritizing the wrong thing.
11:01 Stompro tsbere, the one worry I have about VERP is that we have had to tell many customers to specifically whitelist our sending email address, or their ISP will silently delete our messages or shunt them to the spam folder.  I'm assuming that will all be undone if we switch to VERP.
11:02 tsbere Stompro: Yes, adding the VERP in will change the address and make it harder to say "just whitelist X". As far as I know we haven't told anyone to just whitelist a specific address so we haven't had that particular issue.
11:03 tsbere (though some systems may recognize the VERP info as extra and treat it as the same address. I wouldn't put that past Google, at least.)
11:06 Stompro tsbere, did you think about using the AT.event ids instead, so you could tie the response back to the specific events?  I suppose that would be ugly though.  eg+123+234433+23838383+28288382+28288382@blah.org, hmm, what is the max length of an email address?
11:07 eeevil Stompro: local part can be 64 characters
11:07 tsbere Stompro: I considered using a circ and/or hold id as well as several other pieces of information. The general goal here, though, was information the *library* staff could use themselves.
11:08 tsbere Stompro: Thus, anything that Evergreen doesn't let you look up easily was not an option. Evergreen lets you load patrons by their ID, and the ID is pretty much guaranteed to be acceptable in the email address.
11:10 tsbere Stompro: Also, patron IDs don't tend to change (merging aside), so the email address wouldn't tend to change either. If a patron whitelists the VERP-style address once, they're set, so long as they aren't merged and their home library (which we use to pick the from address) doesn't change.
11:12 Stompro tsbere, thanks, that makes sense.
11:16 tsbere Stompro: Finally doing this may have been somewhat spur of the moment, but I put a lot of thought into it beforehand. Auto-processing bounces with the VERP information was actually my first thought, though.
11:17 tsbere As I don't see us processing bounces automatically any time soon I figured this was good enough
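Even without automatic bounce processing, the lookup tsbere describes is mechanical, so here is a hedged Perl sketch of pulling the patron ID back out of a VERP-style recipient address (the address format follows the regex quoted above; nothing here is MVLC's actual code):

    #!/usr/bin/perl
    # Hypothetical sketch: extract the patron database ID from a
    # VERP-style bounce recipient like "listname+12345@example.org".
    use strict;
    use warnings;

    my $to = 'listname+12345@example.org';
    if (my ($patron_id) = $to =~ /^[^+@]+\+([0-9]+)@/) {
        # Staff would paste this ID into the client's
        # "retrieve patron via database id" lookup.
        print "Bounce belongs to patron ID $patron_id\n";
    } else {
        print "No VERP info found in $to\n";
    }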
11:24 sandbergja joined #evergreen
11:26 Stompro tsbere, I'm not sure how this would fit into the documentation, since much of it is dependent on your mail server/mailing list software.  So it wouldn't exactly be step-by-step instructions.  But it would be good to have the concept in there.  I could create something specifically for exim from my own experience.  What mailing list software are you using?  I guess it could go over the theory, and then some specific examples.
11:27 tsbere Stompro: We are using mailman, with postfix configured to do VERP in front of it.
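On the postfix side, plus-addressing support is a standard setting; whether MVLC relies on exactly this is an assumption, but the idea is:

    # /etc/postfix/main.cf -- treat "+" as the address-extension
    # delimiter, so listname+12345@domain delivers to listname@domain.
    recipient_delimiter = +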
11:27 Stompro Or maybe just add a link to the wikipedia VERP article and call it done :-)
11:28 tsbere Stompro: An example of how to dynamically modify the from address to include the VERP info in the templates doesn't seem like something bad to document
11:39 bmills joined #evergreen
11:56 brahmina joined #evergreen
12:05 dbwells joined #evergreen
12:45 collum joined #evergreen
12:58 jihpringle joined #evergreen
12:59 rlefaive_ joined #evergreen
13:01 eeevil anyone on the web team handy?  I have a request...
13:01 miker and I'm now miker
13:02 kmlussier eeevil is an appropriate nick around Halloween time.
13:02 kmlussier And during the elections, for that matter.
13:02 miker mmmm. points!
13:02 kmlussier miker: Is there something I can help you with?
13:03 miker kmlussier: I have an html-ized version of the governance doc, and would like to get the one on the site replaced now.
13:03 miker something you can help with?
13:03 kmlussier miker: No, actually, I don't think I can.
13:03 kmlussier Sorry!
13:04 miker ah, no worries
13:05 bshum I can help with that
13:06 bshum I think
13:06 rhamby miker: I can do that
13:06 * bshum defers to rhamby then
13:06 miker :)
13:06 * bshum goes back to enjoying his pizza
13:06 rhamby I was going to defer to bshum when I saw him chime in :)
13:06 miker thanks! rhamby, shall I just email the file to you?
13:06 rhamby miker: sure
13:07 miker it is now flowing through the tubes
13:08 rhamby miker: got it, give me a few minutes to upload it and abuse the html to embed it and all that
13:08 miker thanks again
13:10 kmlussier rhamby++ miker++
13:12 rhamby miker: done
13:35 maryj joined #evergreen
14:02 rlefaive joined #evergreen
14:20 eby joined #evergreen
14:50 lualaba joined #evergreen
14:51 lualaba hello team, i have one question. In asset.copy_location, opac_visible is false and deleted = t, but even after restarts i still see the record in the web staff client (version 2.10). any idea?
14:53 miker lualaba: restarts of what, in particular?  could be cached in the apache process
14:54 lualaba i restarted all OpenILS processes, apache, memcached, autogen.sh
14:55 lualaba also i removed one library and it is still visible in a statistical category. what else can i restart? stop apache and start it again?
14:55 miker where in the web staff client are you seeing the deleted=t ones?
14:56 lualaba in shelving location
14:56 miker those show up in several places. do you mean in the shelving location configuration screen, under Admin?
14:57 tsbere lualaba: I imagine they still show up if assigned as the copy location on copies, despite being deleted. Is that what you mean?
14:58 lualaba i deleted it in the DB and i still see it in the web staff client Shelving Location list under the holdings view
14:59 lualaba i see it just in the list
14:59 lualaba in the DB the asset.copy_location record is marked as deleted and opac_visible is false
15:01 lualaba when i add a new copy location (shelving location) it is added immediately
15:03 tsbere lualaba: I will admit I am not 100% certain, but I believe it is showing up because despite being flagged as deleted it is still assigned to copies.
15:04 miker I agree that's very likely
15:04 lualaba how do i select all records and check whether the shelving location is assigned or not?
15:07 lualaba select * from asset.call_number where prefix = N ....
15:08 tsbere lualaba: you want asset.copy and the location field. If you need to deal with call numbers you will need a more complex query, of course.
15:09 mmorgan lualaba: asset.copy.location stores the id of the asset.copy_location  er... as tsbere said
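To make the suggestion concrete, a hedged SQL sketch of the check being proposed (the location ID is an example):

    -- Find non-deleted copies still assigned to shelving location 104;
    -- any rows returned would explain the location still appearing in
    -- holdings views despite being flagged deleted.
    SELECT ac.id, ac.barcode, ac.circ_lib
      FROM asset.copy ac
     WHERE ac.location = 104
       AND NOT ac.deleted;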
15:10 hbrennan joined #evergreen
15:11 lualaba like so, thank you, i will update the location rows and see the result
15:15 hbrennan jeffdavis: Do you know which library's OPAC was used for mrpeter's Overdrive presentation at the 2016 conference? I'm poking around at various BC OPACs but can't find one that's implemented the checkout/download functions (all I'm seeing is OPACs with availability/hold info)
15:16 kmlussier hbrennan: I'm not sure, but it may have been PINES?
15:16 lualaba i updated asset.copy but i still see the record in the shelving location list
15:17 jeffdavis hbrennan: I can point you at Sitka OPACs that have that stuff turned on, but you wouldn't see most of the checkout or account stuff without a login
15:18 hbrennan jeffdavis: I'm just realizing that
15:18 hbrennan Still waiting on my universal library card....
15:19 jeffdavis heh
15:19 mmorgan lualaba: Could you possibly be confusing asset.copy.location and asset.copy.circ_lib?
15:20 hbrennan Someone at the state library here is claiming that they are NOT downloading records, and are instead receiving ALL catalog info through an Overdrive API .... possible?
15:21 hbrennan downloading records meaning they're not importing MARC records into ILS
15:21 bmills joined #evergreen
15:21 jeffdavis certainly possible, you can search the Overdrive collection via the API
15:22 jeffdavis I don't see how you could integrate the results into EG search results though, unless you're using a discovery layer that handles federated search
15:22 jeffdavis i.e. not using TPAC directly
15:22 hbrennan Back in April it was my understanding that records still needed to be added
15:23 jeffdavis yes, in EG you need to have imported MARC records from Overdrive
15:23 lualaba no. select * from asset.copy_location shows the row (deleted = t, opac_visible = f); select * from asset.copy where location = 109 returns nothing
15:23 hbrennan Most of Alaska is now part of a Sirsi consortium, which I see is now buddy-buddy with Overdrive
15:23 dbs Our public library (not Evergreen) just has an entirely separate parallel website for Overdrive stuff
15:23 lualaba select * from asset.copy_location shows the row (deleted = t, opac_visible = f); select * from asset.copy where location = 104 returns nothing
15:24 hbrennan dbs: Same. Well, we imported all Overdrive records manually, but after moving to Evergreen have not added a single one (my job changed!)
15:24 dbs It would be reasonably deep voodoo to integrate Overdrive into Evergreen search & display without MARC records
15:24 hbrennan dbs: That's what I was thinking. But it looks like what they're doing
15:24 hbrennan https://jlc-web.uaa.alaska.edu/client/en_US/apl/search/detailnonmodal/ent:$002f$002fERC_11168_7419$002f0$002fOVERDRIVE:a05c8b64-d726-45b2-81ce-7fffbde72b73/ada?qu=girl+on+the+train&rt=false%7C%7C%7CTitle%7C%7C%7Cfalse
15:24 rlefaive joined #evergreen
15:24 hbrennan http://adl.lib.overdrive.com/081CF305-419D-43EC-B6E0-A91829F698AD/10/50/en/ContentDetails.htm?id=A05C8B64-D726-45B2-81CE-7FFFBDE72B73
15:25 hbrennan Exact language from Overdrive site shows on their OPAC
15:25 tsbere lualaba: Does a *different* location have the same label as the deleted one?
15:27 lualaba you mean label_prefix and suffix? no
15:29 bmills joined #evergreen
15:29 * mmorgan thinks tsbere is referring to asset.copy_location.name
15:29 hbrennan dbs: You're a master of creating interesting things... Could you share the reason you don't integrate your Overdrive items? Staff time? (That's the biggie for us)
15:32 jeffdavis hbrennan: It's easy enough to incorporate metadata from Overdrive into local OPAC display. It's merging search results from the local system and from Overdrive that is tricky. If they're really not just importing MARC, I presume it's easier with whatever Sirsi product those folks are using than it is with EG.
15:33 jeffdavis Incidentally it's interesting that you should mention this today because I was hoping to solicit feedback on bug 1541559
15:33 pinesol_green Launchpad bug 1541559 in Evergreen "OneClickdigital API integration" [Wishlist,New] https://launchpad.net/bugs/1541559 - Assigned to Jeff Davis (jdavis-sitka)
15:33 hbrennan dbs: My guess is that Sirsi and Overdrive have a business-y deal that somehow makes them both more money
15:34 hbrennan jeffdavis: I'll take a look at your bug
15:36 JBoyer I would expect SIRSI has a simple automated OD importer, you feed it your FTP info and records just happen. We have something like that set up here, but it's not really general purpose enough to share around. Works well for us though.
15:40 hbrennan JBoyer: That's the ideal setup. Since moving to Evergreen my job doesn't allow the time to catalog records anymore (I did all the electronic previous to our migration)
15:41 hbrennan so we haven't added any MARC records for Overdrive items since end of 2012 :(
15:43 Dyrcona According to SIRSI sales people, your Overdrive records just magically show up in the catalog. ;)
15:44 hbrennan Dyrcona: Dang it! Why are they hogging all the magic?
15:44 Dyrcona MVLC loads Overdrive records into the catalog.
15:44 Dyrcona C/W MARS does also, if I'm not mistaken.
15:44 hbrennan Dyrcona: Manually, as if you were cataloging anything else?
15:45 Dyrcona It was manual, imported from OCLC somehow. tsbere may be working on automating more of that.
15:45 tsbere I wrote a perl script to prep the records. I still import through Vandelay.
15:46 jeffdavis ^ That's what we do too.
15:46 hbrennan Dyrcona: I know a lot of the bigger/IT-staffed use scripts
15:46 hbrennan tsbere: Clarify prep?
15:46 hbrennan tsbere: like you run them through MARC report or something to clean them up?
15:46 * Dyrcona is working on script(s) to import records from other vendors.
15:46 kmlussier hbrennan: NOBLE imports Overdrive records through Vandelay too, along with lots of other ebook records.
15:46 tsbere hbrennan: mainly 856 prep and de-duping
15:47 hbrennan tsbere: Ahhh.
15:47 hbrennan Well this is making me feel better that we aren't the only "old school" people out there
15:48 Dyrcona C/W MARS currently imports most records through Vandelay.
15:48 dbs Yeah, I could totally see using the Overdrive API to generate MARC records, then feed them into Evergreen with a straight SQL "CREATE or UPDATE" based on whatever matchpoint Overdrive gives you, and an 856. Because located URIs rock.
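A hedged sketch of the "straight SQL" approach dbs outlines, assuming the OverDrive identifier is carried as the record's TCN matchpoint (that choice, the placeholders, and the transaction label are all illustrative, not anyone's actual script):

    -- Upsert a vendor bib by its matchpoint; :marcxml and :overdrive_id
    -- are placeholders for values generated from the OverDrive API.
    UPDATE biblio.record_entry
       SET marc = :marcxml, edit_date = NOW()
     WHERE tcn_value = :overdrive_id AND NOT deleted;

    INSERT INTO biblio.record_entry (marc, last_xact_id, tcn_value)
    SELECT :marcxml, 'overdrive-import', :overdrive_id
     WHERE NOT EXISTS (
        SELECT 1 FROM biblio.record_entry
         WHERE tcn_value = :overdrive_id AND NOT deleted);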
15:48 dbs hbrennan: fwiw, I have nothing to do with our public library other than being a user; all of my Evergreen work is with our university
15:49 hbrennan So any guesses on how far out the Evergreen community is to having this magical ability to "do nothing" and have the records show up in the catalog?
15:50 jeffdavis First step would be to open a feature request in Launchpad I suppose.
15:50 kmlussier hbrennan: As with anything, I guess it's dependent on the will of Evergreen users to make that work happen.
15:50 hbrennan I'm getting pressure to solve this issue, but I could convince staff to wait if there's some future date this could happen. We've already waited 4 years
15:50 hbrennan kmlussier: So technically, it is possible then
15:51 JBoyer OD importers: you say you "use Vandelay" to import the records, do you mean the actual staff client importer or marc_stream_importer.pl (uses vandelay on the backend)?
15:51 dbs I bypass Vandelay entirely and just go straight to SQL
15:51 hbrennan To make matters worse, we left Sirsi to join Evergreen and then the state started a Sirsi consortium just months later. There are only two of us Evergreen libraries in the state
15:51 kmlussier JBoyer: At NOBLE, they are using the client Vandelay now, but have used MARC stream importer in the past.
15:52 jeffdavis JBoyer: We use the staff client, I think because it's what the person doing the imports is used to.
15:53 jeffdavis hbrennan: I don't know your situation, but I would guess you'd be better off doing manual imports via Vandelay rather than waiting for an automatic process that may or may not get developed eventually.
15:53 jeff hbrennan: what are you trying to solve?
15:54 hbrennan jeff: The issue of whether we start importing records again, and how
15:54 jeff hbrennan: why did you stop?
15:55 Dyrcona When I say Vandelay, I mean the staff client.
15:55 hbrennan jeff: Migration to Evergreen
15:55 kmlussier hbrennan: How many records are you talking about? I'm thinking the time it takes to load those records wouldn't be too extensive for your library.
15:55 hbrennan jeff: We stopped cataloging for those few months (all records), then once we were on Evergreen I started doing Evergreen duties
15:55 Dyrcona I typically prefer to do what dbs does and go straight to SQL, and that's how the importer that I am working on will function.
15:55 hbrennan jeff: My cataloging replacement started doing other work too, and it just slipped away
15:55 jeff hbrennan: how did you bring the records in prior to your migration to Evergreen?
15:56 hbrennan jeff: Manual cataloging
15:56 jeff hbrennan: do you participate in OverDrive as your library, or as part of a larger consortium?
15:56 hbrennan kmlussier: A few hundred a month. It wasn't crazy, but it took time
15:56 hbrennan jeff: Overdrive is state-wide, but we have Advantage titles too
15:57 kmlussier hbrennan: In your case, are you using located URIs for your Overdrive records or are you adding them with a transparent bib source?
15:57 hbrennan jeff: So the records are given to us in batches, but only for the consortium titles
15:58 jeff hbrennan: reach out to your contact at the consortium or overdrive, inquire about receiving OverDrive "MARC Express" records for your consortium and advantage (locally purchased) titles. Ask them if they can insert the required subfields in the 856 tags.
15:58 jeff If they can't, then your process will likely be "obtain MARC express file from OverDrive, pass through a perl/python script or MARCEdit batch update to add proper subfields in 856 tags"
15:58 jeff If they can add the subfields, even better.
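A hedged Perl sketch of the batch-update step jeff describes, using MARC::Record to add the subfield Evergreen's located URIs expect (the file names and org unit shortname are examples, and this isn't anyone's production script):

    #!/usr/bin/perl
    # Hypothetical 856 prep: add subfield 9 (owning org unit shortname)
    # to each 856 in a batch of OverDrive records so Evergreen treats
    # the link as a located URI.
    use strict;
    use warnings;
    use MARC::Batch;

    my $batch = MARC::Batch->new('USMARC', 'overdrive_express.mrc');
    open my $out, '>', 'overdrive_prepped.mrc' or die $!;
    while (my $record = $batch->next) {
        for my $field ($record->field('856')) {
            $field->add_subfields('9' => 'APL');   # example shortname
        }
        print {$out} $record->as_usmarc;
    }
    close $out;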
15:59 hbrennan jeff: We can get the Marc Express records "fairly quickly" according to my source at the state
15:59 jeff If you don't currently receive your advantage titles via their MARC Express service, see if that's an option -- you may need to reach out to your sales rep (if you have one, and don't only go through the consortium).
15:59 hbrennan jeff: The full records already contain the 856 to Overdrive, so I assume the same for the Express ones?
16:00 jeff hbrennan: if you get the full OverDrive-resold OCLC records, even better.
16:00 jeff hbrennan: you have some choices with their MARC Express offering, in terms of if they include just the 856 link for the record detail page, or if they also include 856 tags for excerpt/preview and cover image.
16:00 hbrennan jeff: That's my thinking too. Some staff think we should just "quick catalog" these records and not worry about fixing things, but I don't know why we would treat electronic resources any different than a book.
16:01 jeff hbrennan: Making those choices on your own vs as part of the consortium might be tricky, or might be simple.
16:01 hbrennan jeff: Ha!
16:01 hbrennan Anyone have examples of the Express records in their catalog?
16:01 jeff Once you have occasional batches of bibs in MARC format and you or OverDrive have customized the 856 subfields, you can import them into Evergreen in those batches.
16:03 hbrennan And the batches of records were separated by type, so a batch of 200 would be e-books, then another would be e-audio.. so that part was pretty easy to deal with
16:03 * jeff nods
16:03 jeff better than 1) not having bibs or 2) creating one at a time
16:03 hbrennan I guess I was just searching for that magical way of doing things, which doesn't exist yet - totally understandable.
16:03 hbrennan jeff: Oh yeah, that would be awful
16:04 jeff there are a few very local workflows that probably work close to magic, and there are a number of great possibilities for that "magic for everyone" vision in the future.
16:05 hbrennan I hope to live long enough to see the magic for everyone world
16:06 Dyrcona heh. It isn't magic. They get the records from Overdrive and you somehow indicate to Sirsi-Dynix which ones you own.
16:07 kmlussier hbrennan: Yeah, I think once you upload the records, you'll find it doesn't take much time to do the uploads on a monthly basis.
16:07 Dyrcona I believe that they also make use of the Overdrive API.
16:08 kmlussier hbrennan: My one word of caution is that when you do your initial upload, you might want to break up the records into smaller sets. We have trouble with Vandelay uploads that are too large.
16:08 hbrennan kmlussier: How many do you typically import at a time without issues?
16:11 kmlussier hbrennan: Maybe 1,000? The person at NOBLE does smaller batches, but since you're a standalone library, I think you could do more.
16:12 hbrennan Oh jeez. Well, the batches I remember were usually 150-300.. maybe 400 sometimes
16:12 hbrennan and sometimes only 2!
16:12 kmlussier hbrennan: Yes, I think you'll be okay for the monthly batches. I was just thinking about when you play catchup with the records you haven't loaded since migrating.
16:13 hbrennan Oh yeah, that monster...
16:13 hbrennan I'll have to find out how many records that is
16:13 hbrennan I have 230 emails from OCLC with the record batches we haven't done
16:14 kmlussier jeff: I'm curious. Why were you recommending the Express records?
16:15 jeff kmlussier: because they're free and of better quality than those that you might generate using the "metadata" files that they previously had as their only option.
16:15 jeff kmlussier: above, i also stated "if you get the full OverDrive-resold OCLC records, even better" :-)
16:16 collum kmlussier: free is why we use them.
16:16 hbrennan Since the state is already paying for the full records, that's no issue for us. We might use Express for our Advantage copies, if we decide we don't have the budget to buy the fulls
16:16 jeff that said, the marc express records are not without quirks.
16:16 kmlussier jeff: Ah, I missed that part. I thought you were recommending Express over the OCLC ones she was getting. :)
16:16 jeff kmlussier: nope! :-)
16:17 hbrennan Do y'all importers run records through Marc Report (or similar) first?
16:17 jeff though i would be interested in comparing the two again at some point -- i could see potential benefit in locally enhancing the OCLC record with data from the MARC Express record and/or from the OverDrive metadata API
16:17 hbrennan In the past I found that it was worth it
16:17 kmlussier hbrennan: MARC Report?
16:17 hbrennan kmlussier: error checking software
16:19 hbrennan kmlussier: Finds and helps fix all kinds of record issues, like conflicts between indicators, punctuation mistakes, or missing elements
16:19 kmlussier Ah, I can't answer that question. I'm pretty sure MARC Edit is sometimes used for e-records to add subfield 9's, but that's because we're using Located URIs. If you're using a transparent bib source for your e-records, that step wouldn't be needed.
16:28 hbrennan Well, thank you all for your ideas and workflows! This gives me a narrower direction to focus on. Watch out, HERE COMES THE KARMA!  jeffdavis++ kmlussier++ dbs++ tsbere++ jeff++ jboyer++ Dyrcona++ collum++
16:29 Dyrcona It's karma time! It's excellent!
16:40 Christineb joined #evergreen
17:08 mmorgan left #evergreen
20:57 dbwells_ joined #evergreen
