Time |
Nick |
Message |
00:01 |
|
bshum joined #evergreen |
00:06 |
ATS_JC |
hbrennan++ |
01:28 |
|
eby__ joined #evergreen |
04:17 |
|
tfaile joined #evergreen |
05:12 |
|
phasefx joined #evergreen |
05:30 |
pinesol_green |
Incoming from qatests: Test Success - http://testing.evergreen-ils.org/~live/test.html |
07:00 |
|
timf joined #evergreen |
07:03 |
|
Callender joined #evergreen |
07:04 |
|
tfaile joined #evergreen |
07:04 |
|
tfaile joined #evergreen |
07:22 |
|
mtate joined #evergreen |
07:31 |
|
mtate joined #evergreen |
07:51 |
|
kmlussier joined #evergreen |
07:51 |
|
rjackson-isl joined #evergreen |
07:54 |
|
collum joined #evergreen |
08:00 |
|
mrpeters joined #evergreen |
08:04 |
mrpeters |
eeevil: question about build-eg-replication.sh -- should I end up with a slon_tools.conf, in addition to the slonik scripts (which I did get) when running this or does one have to be built by hand using the schema documentation for a particular version? |
08:10 |
eeevil |
mrpeters: I have not personally maintained that script. it looks like it cares about all the schemas that would have tables, though, so it should be close. making sure it gets everything needed for a particular instance is left as an exercise for the reader, of course. also, that's just one part. you also need to actually clone the schema from the master to the secondary, which is outside the scope of that script |
08:11 |
mrpeters |
eeevil: yeah, im familiar with setup once I have a good slon tools conf, i've just been lucky to always be upgrading, not starting from scratch on a slon_tools.conf |
08:11 |
mrpeters |
so, i guess my question is....is there another process that converts the .slonik files generated by your handy script to a slon_tools.conf ? |
08:13 |
mrpeters |
seems like the db info is in the preamble and store_paths slonik files, so perhaps it could, i'm just missing a step |
08:14 |
mrpeters |
your script works quite well, even for not being maintained -- it seems to do a good job of getting the schema of a particular version |
08:19 |
eeevil |
looked around in some non-core repos, but I don't see one. but really all you need is to create a big list of tables and sequences, which is exactly what you get from the queries embedded in that script. so, should be relatively simple to repurpose that data |
08:20 |
mrpeters |
yeah 10-4, i know what one SHOULD look like, just thought i'd double check before i went the manual route |
08:23 |
mrpeters |
eeevil++ thanks |
08:23 |
eeevil |
np |
08:34 |
mrpeters |
for the logs --- http://pastie.org/9082047 use that, and eeevil's script (mainly, the SQL inside to generate the list of pkeyed tables and sequences for replication) and plug them in using the syntax from the few examples in that paste (and of course modify your db connection information) |
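For the archives: the slon_tools.conf under discussion is a Perl configuration file consumed by Slony-I's altperl tools. A minimal sketch of its shape follows; the host names, node numbers, and the two example tables/sequences are placeholders only, the real lists come from the queries embedded in build-eg-replication.sh as mrpeters describes.

```perl
# Sketch of a minimal slon_tools.conf (Slony-I altperl tools).
# Hosts, user, and the table/sequence lists below are placeholders.
$CLUSTER_NAME = 'evergreen';
$LOGDIR       = '/var/log/slony';
$MASTERNODE   = 1;

add_node(node => 1, host => 'db-master.example.org',
         dbname => 'evergreen', port => 5432, user => 'slony');
add_node(node => 2, host => 'db-replica.example.org',
         dbname => 'evergreen', port => 5432, user => 'slony');

$SLONY_SETS = {
    "set1" => {
        "set_id"      => 1,
        "table_id"    => 1,
        "sequence_id" => 1,
        # paste the pkey'd tables and sequences generated by the
        # script's SQL here
        "pkeyedtables" => [
            'biblio.record_entry',
            'asset.copy',
        ],
        "sequences" => [
            'biblio.record_entry_id_seq',
        ],
    },
};

1;  # the altperl tools require() this file, so it must return true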
08:34 |
|
ericar joined #evergreen |
08:35 |
|
Dyrcona joined #evergreen |
08:38 |
|
Shae joined #evergreen |
08:40 |
|
RoganH joined #evergreen |
08:40 |
|
mmorgan joined #evergreen |
08:47 |
csharp |
mrpeters++ |
09:01 |
Dyrcona |
Software sucks. Discuss. |
09:01 |
csharp |
it means our goal is making software suck less? |
09:01 |
phasefx |
it means we should be programming vacuum cleaners |
09:02 |
* csharp |
runs VACUUM ANALYZE on his living room |
09:03 |
phasefx |
it behoovers you to do that (ducks, runs) |
09:03 |
csharp |
phasefx++ |
09:06 |
Dyrcona |
Y'know, my maternal grandfather would not allow his wife to buy a Hoover vacuum cleaner because he hated President Hoover. |
09:07 |
Dyrcona |
phasefx++ |
09:07 |
RoganH |
I kind of miss loading punch cards as software (yes, I was a child when I helped do this). |
09:08 |
Dyrcona |
Cassette tape storage..... Nah, I don't miss it. |
09:08 |
csharp |
that was how UGA class registration still worked when I was there (punch cards) |
09:09 |
csharp |
it was *amazing* when we were able to register online a couple of years into it ;-) |
09:09 |
RoganH |
I never used cassettes for storage, I went from punch cards to 5 1/4 floppies. |
09:09 |
* csharp |
cranks up 56K modem to relive the memories |
09:10 |
* Dyrcona |
still has a 14.4k modem in a box, and used to share that connection among 3 computers. |
09:10 |
Dyrcona |
I think most of my Russian email spam actually comes from scripts I wrote for dial on demand being translated and posted on "hacker" sites. |
09:12 |
* Dyrcona |
is probably mistaken. |
09:15 |
* Dyrcona |
considers making #callahan an alias for #lucky in search. ;) |
09:18 |
|
mdriscoll joined #evergreen |
09:20 |
|
kbeswick joined #evergreen |
09:21 |
jl- |
morning |
09:22 |
jl- |
voyager besides authority and bib records, voyager also gave me 'item' records and 'mfhd' records.. do I need to treat them specially or can I import them as usual? |
09:26 |
|
eby__ joined #evergreen |
09:27 |
Dyrcona |
mfhd records should follow a standard and have the information in the MARC. The old export tools could supposedly import them. |
09:27 |
Dyrcona |
Item records could be anything at all. |
09:29 |
jl- |
Dyrcona: thanks. I'm just now seeing that the item file is .txt and not .mrc |
09:29 |
jl- |
lol |
09:29 |
|
dluch joined #evergreen |
09:30 |
Dyrcona |
Open it with a text editor and see what it looks like. Chances are it is some sort of delimited file. |
09:30 |
jl- |
2|280344|416515|Not Charged|Microfiche|||Microfiche| |
09:30 |
Dyrcona |
Voyager support should also be able to provide you with some documentation, I imagine. |
09:30 |
jl- |
very little |
09:30 |
jl- |
vendor lock-in |
09:30 |
jcamins |
Dyrcona: AHAHAHAHAHAHAHAHAHAHA |
09:30 |
jl- |
;) |
09:31 |
Dyrcona |
Eh, well, some vendors are better than others. |
09:32 |
jcamins |
jl-: you might find useful hints in the scripts for migrating Voyager data to Koha at https://gitorious.org/koha-toolbox but I don't recall what's there. |
09:33 |
mrpeters |
jl-: can you paste a sample? |
09:33 |
mrpeters |
maybe a few lines |
09:34 |
mrpeters |
i think i've done a voyager migration before.... |
09:34 |
jl- |
mrpeters: from which one? items or mfhd? |
09:34 |
jcamins |
jl-: I should note that Voyager is one of the ILSes where different sites have different data formats. |
09:34 |
mrpeters |
items |
09:34 |
mrpeters |
mfhd im not familiar with |
09:35 |
jl- |
jcamins: what are the implications of that? I've been using equinox migration tools and they have served me well so far |
09:36 |
jl- |
mrpeters: http://paste.debian.net/hidden/46f59b30/ |
09:36 |
jcamins |
jl-: potentially you'll run into a situation where existing tools don't seem to be giving you the expected result. That's all. |
09:37 |
jl- |
jcamins: yes, thankfully this is a test run |
09:37 |
jl- |
I've already had to do some costumizing |
09:38 |
jl- |
*custom |
09:38 |
jl- |
and thanks for the koha link jcamins, that is the next ILS we are wanting to test |
09:39 |
jcamins |
There are other versions of the migration toolkit, so you might want to look around to see if anyone has anything better. |
09:39 |
dbs |
jl-: mfhd for serials may end up going into serial.record_entry if your library doesn't circulate your serials |
09:39 |
Dyrcona |
jl-: Looking at the code that jcamins pointed out, it appears to expect some item information in the bibliographic (MARC) records. That gets matched up with other information from items.txt later. |
09:40 |
|
yboston joined #evergreen |
09:40 |
mrpeters |
jl: yeah, that can be converted to tab delimited real easy (that's what i would do) |
09:40 |
mrpeters |
did it include any headers? |
09:40 |
mrpeters |
i'm guessing the 2nd and 3rd columns are barcode, and a link to a bib record id, but im not sure in which order |
09:41 |
jl- |
mrpeters: no headers |
09:41 |
mrpeters |
last column seems to be copy locations, column 5 looks to be a good choice for mapping to an eg circ_modifier |
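The tab-delimited conversion mrpeters suggests is a one-liner's worth of work; a sketch (the column meanings here are the guesses from the discussion above, not confirmed Voyager semantics):

```python
# Minimal sketch: convert a pipe-delimited Voyager item export
# (like the sample pasted above) to tab-delimited for loading.
import csv
import io

def pipes_to_tabs(text):
    out = io.StringIO()
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    for line in text.splitlines():
        if not line.strip():
            continue
        fields = line.split("|")
        # a trailing "|" produces one empty final field; drop it
        if fields and fields[-1] == "":
            fields.pop()
        writer.writerow(fields)
    return out.getvalue()

sample = "2|280344|416515|Not Charged|Microfiche|||Microfiche|"
print(pipes_to_tabs(sample))
```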
09:41 |
Dyrcona |
jl-: You should examine a few of your bib records to see if there are what appears to be barcodes in them. |
09:42 |
mrpeters |
yeah, can you do a yaz-marcdump on the marc file and paste one record? |
09:42 |
Dyrcona |
jl-: If there are not, you should find out from Voyager how to export them with item information. They should help you with that as that can be required for things other than migration away from them. |
09:43 |
mrpeters |
852, 949, 999, etc. are all popular locations for that item specific info |
09:43 |
jl- |
sure, sec |
09:43 |
* dbs |
is trying to recall whether he used Windsor's MFHD records or the item export when they migrated from Voyager to Evergreen |
09:44 |
jl- |
mrpeters: http://paste.debian.net/hidden/74d9846b/ |
09:44 |
mrpeters |
hmmm was that done with yaz-marcdump? |
09:44 |
Dyrcona |
https://gitorious.org/koha-toolbox/koha-migration-toolbox/source/0c34e57f14034e189383cb1269a7245bc4b2329d:migration/Voyager/biblio_masher.pl#L220 |
09:44 |
mrpeters |
yaz-marcdump yourbibs.mrc | less from cmd line |
09:44 |
jl- |
yes but then I formatted it |
09:44 |
jl- |
for the sql loader |
09:45 |
Dyrcona |
Beginning there can help you see what the different fields in items.txt likely are. |
09:45 |
jl- |
I can give you the pure marcdump file hold on |
09:45 |
mrpeters |
hmm, sure doesn |
09:45 |
mrpeters |
doesnt look like this has copy info |
09:45 |
mrpeters |
but i bet "15787", etc. are the bib id's |
09:45 |
mrpeters |
see if those pop up in your items text file |
09:46 |
jl- |
they do |
09:46 |
mrpeters |
there's your linkage right there then |
09:46 |
mrpeters |
so thats good |
09:46 |
mrpeters |
i think its safe to assume the other number out of column 2 and 3 is the barcode |
09:46 |
Dyrcona |
It looks like items.txt should have the barcode. |
09:47 |
* jl- |
didn't realize eg was going to be a pandora's box |
09:47 |
Dyrcona |
jl-: All ILS are Pandora's Boxes. |
09:47 |
|
BigRig joined #evergreen |
09:47 |
Dyrcona |
Right now, we're talking about Voyager data, not Evergreen. :) |
09:48 |
mrpeters |
jl-: this honestly doesnt look like too bad of a migration |
09:48 |
RoganH |
No, because if they were Pandora's Boxes hope would still be in the old one after you get the evil out. I've never seen anything left in the old ones after a migration but despair. |
09:48 |
mrpeters |
such a small amount of data for each item = less work for you hah! |
09:48 |
RoganH |
Hope is usually in the new system I've moved them to. |
09:48 |
Dyrcona |
RoganH: Hope quickly turns to despair, no matter the software. |
09:49 |
RoganH |
Dyrcona: you do have a point |
09:49 |
|
ats_JC joined #evergreen |
09:49 |
Dyrcona |
mrpeters++ |
09:50 |
Dyrcona |
Yeah, this looks easier than some others I've done.... *cough* TLC *cough*. |
09:50 |
mrpeters |
^^^^^ |
09:50 |
Dyrcona |
koha++ |
09:50 |
Dyrcona |
jcamins++ |
09:50 |
mrpeters |
millenium kicked my ass recently |
09:51 |
ats_JC |
hi!! |
09:51 |
Dyrcona |
I should clarify... It wasn't the format of the data so much as the crappy cataloging, and that some records were ISO8859-1 while others were some other charset. |
09:52 |
|
jboyer-isl joined #evergreen |
09:52 |
Dyrcona |
Good [insert time of day for your locale here], ats_JC! |
09:52 |
|
denishpatel joined #evergreen |
09:52 |
ats_JC |
im having problems with evergreen. can you help me :) |
09:54 |
Dyrcona |
ats_JC: Just explain your problem and ask your questions. If someone can help you, and they're paying attention, they will. |
09:54 |
ats_JC |
oh thank you sir! |
09:54 |
ats_JC |
Our library is using athena. We already set up the server and the client of evergreen, we tried importing the database from athena. I have some questions with evergreen. |
09:55 |
ats_JC |
Imported items are good, already checked their marcs and their good, I'm having problems with call numbers and barcodes. When I try to search using our assigned barcode from athena or item status by barcode its giving me "SL29123 was either mis scanned or not cataloged", |
09:55 |
ats_JC |
the SL29123 is our assigned barcodes from athena |
09:56 |
ats_JC |
then when I try to search using call number it didn't give any results |
09:56 |
Dyrcona |
How are you searching for barcodes? |
09:56 |
ats_JC |
by a barcode scanner |
09:57 |
Dyrcona |
Are you trying to scan them in item status or circulation? Are you using advanced numeric search? |
09:57 |
ats_JC |
yes\ |
09:57 |
Dyrcona |
Tell me which one. |
09:58 |
ats_JC |
the item status in circulation |
09:58 |
ats_JC |
the item status |
09:58 |
ats_JC |
we already checked the marcs and their good on their places |
09:58 |
Dyrcona |
ats_JC: Do you have access to the database? |
09:59 |
ats_JC |
i dont have the access right now :) Im from Philippines hehe |
09:59 |
ats_JC |
hmm maybe tomorrow ill give you some screenshots |
10:00 |
Dyrcona |
ats_JC: Your geographic location should have nothing to do with it. There's this thing called the Internet. :) |
10:00 |
Dyrcona |
Nah, don't need screen shots. |
10:00 |
|
kbeswick joined #evergreen |
10:00 |
Dyrcona |
I'm guessing you don't actually have any copies. |
10:00 |
ats_JC |
hmm |
10:00 |
ats_JC |
but when we search on circulation its already appearing |
10:00 |
Dyrcona |
When you can get into the database, run this query: select count(id) from asset.copy; |
10:00 |
RoganH |
ats_JC : you may be searching the bib or marc records but there may not be copy records |
10:01 |
Dyrcona |
What do you mean "when we search on circulation its already appearing?" |
10:01 |
RoganH |
ats_JC: you can have many copies for circulation but one bib describing the item |
10:01 |
Dyrcona |
You said before that they weren't. |
10:02 |
ats_JC |
when we tried to item status by barcode and searching by call numbers |
10:02 |
ats_JC |
i dont know why |
10:03 |
RoganH |
ats_JC: Athena stores call numbers and barcodes in the MARC, Evergreen uses separate records for those |
10:03 |
ats_JC |
oh |
10:04 |
RoganH |
That's why Drycona wanted the info from the asset.copy table, that would give more information. |
10:06 |
ats_JC |
hmm thanks guys |
10:06 |
ats_JC |
ill check it out tomorrow |
10:06 |
ats_JC |
then show some screens :) |
10:07 |
ats_JC |
its really driving me crazy haha |
10:07 |
|
kmlussier joined #evergreen |
10:07 |
ats_JC |
is it still possible even if the marc tags on the marc records are all present |
10:08 |
RoganH |
There is a learning curve with Evergreen especially in terms of complexity if you're coming from an older product like Athena that was just a few steps removed from a flat text file database. |
10:08 |
RoganH |
The marc records can be completely fine but that doesn't mean there are any call number or copy records present. |
10:08 |
RoganH |
Copy records attach to call number records which attach to bib records (basically) |
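RoganH's bib / call number / copy chain can be seen directly in the schema. A sketch, using the stock Evergreen table and column names (the NOT ... deleted filters are the usual caveat):

```sql
-- Bib -> call number -> copy, per the description above. A bib with
-- no asset.call_number / asset.copy rows attached will never match
-- a barcode or call number search, even with a perfect MARC record.
SELECT bre.id AS bib_id, COUNT(acp.id) AS copies
  FROM biblio.record_entry bre
  LEFT JOIN asset.call_number acn
         ON acn.record = bre.id AND NOT acn.deleted
  LEFT JOIN asset.copy acp
         ON acp.call_number = acn.id AND NOT acp.deleted
 GROUP BY bre.id
 ORDER BY copies;
```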
10:08 |
Dyrcona |
ats_JC: How did you import the MARC records into Evergreen? |
10:09 |
ats_JC |
we extracetd the data sets from athena |
10:10 |
ats_JC |
then we let our IT staff to do the conversion |
10:10 |
ats_JC |
he followed the steps on the documentary present on the evergreen website |
10:10 |
ats_JC |
we use linux and postgre |
10:10 |
ats_JC |
open-srf |
10:10 |
Dyrcona |
I see. Maybe someone from your IT staff should join the conversation. |
10:11 |
Dyrcona |
There's a lot of documentation on the Evergreen web site and some of it should be removed. |
10:11 |
ats_JC |
oh |
10:11 |
ats_JC |
so thats it |
10:12 |
Dyrcona |
Well, that's not necessarily "it." |
10:12 |
ats_JC |
ahaha |
10:12 |
ats_JC |
so its doomsday |
10:12 |
Dyrcona |
Depending on the method used to import the records, there are different ways to get the records for call numbers and copies created. |
10:12 |
ats_JC |
ahaha |
10:12 |
ats_JC |
noted |
10:13 |
Dyrcona |
No, you can possibly have them made after the fact if you know what MARC fields and subfields hold the necessary information. |
10:13 |
Dyrcona |
It will take some custom code, though. |
10:14 |
|
mceraso joined #evergreen |
10:14 |
ats_JC |
thank you sir. ill try to let him join the conversation |
10:17 |
ats_JC |
Dyrcona++ |
10:17 |
|
atlas__ joined #evergreen |
10:17 |
ats_JC |
RoganH++ |
10:21 |
ats_JC |
have a good day!! :) thanks for the time |
10:30 |
|
kbeswick joined #evergreen |
10:33 |
|
ats_JC joined #evergreen |
10:39 |
ats_JC |
hi guys. how can we record new books into the evergreen? |
10:41 |
Dyrcona |
Well, you could have your staff catalog everything by hand, but it sounds like you already have the records loaded with copy information in some MARC field. |
10:42 |
Dyrcona |
You need to find out what field has that information and what fields in the asset.call_number and asset.copy tables the subfields correspond to. |
10:43 |
Dyrcona |
Then, you need to have someone write a program to extract the information from the marc in the biblio.record_entry table and to create the asset.call_number and asset.copy table entries. |
10:43 |
Dyrcona |
Piece of cake! ;) |
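A rough sketch of what the core of such a program might look like in SQL, assuming, hypothetically, that the call number landed in 852$h during the import; the owning_lib (4) and creator/editor (1) values are illustrative placeholders, and your field/subfield will depend on what Athena actually exported:

```sql
-- Hypothetical: build call number rows from an assumed 852$h in the
-- imported MARC. Field/subfield, owning_lib, and creator/editor are
-- placeholders for your data.
INSERT INTO asset.call_number (creator, editor, record, owning_lib, label)
SELECT 1, 1, bre.id, 4,
       oils_xpath_string('//*[@tag="852"]/*[@code="h"]', bre.marc)
  FROM biblio.record_entry bre
 WHERE NOT bre.deleted
   AND oils_xpath_string('//*[@tag="852"]/*[@code="h"]', bre.marc) <> '';
```

asset.copy rows (barcode, call_number, circ_lib, plus the NOT NULL loan_duration and fine_level columns) would be created the same way in a second pass.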
10:44 |
|
kbeswick joined #evergreen |
10:44 |
ats_JC |
oh thank you!!! :) |
10:45 |
ats_JC |
its really hard when you are from athena where you'll just input those infos |
10:45 |
ats_JC |
Dyrcona++ |
10:46 |
kmlussier |
ats_JC: If you do need to manually catalog the books, though, you might want to look at the docs at http://docs.evergreen-ils.org/2.1/html/adding_holdings.html |
10:46 |
Dyrcona |
ats_JC: Its never obvious, regardless of the system. Only experience teaches these lessons. |
10:46 |
kmlussier |
It's for 2.1, but it should still be relevant. It looks like cataloging docs haven't been moved up yet. |
10:47 |
Dyrcona |
And, no, I've never worked with Athena, but I've moved enough data into Evergreen to know what Evergreen expects. |
10:47 |
bshum |
Athena... ugh |
10:50 |
ats_JC |
ahaha Athen....Jurassic |
10:50 |
ats_JC |
ahahaha |
10:53 |
ats_JC |
kmlussier++ |
10:53 |
ats_JC |
Dyrcona++ |
10:53 |
ats_JC |
really guys... thanks!! |
10:57 |
Dyrcona |
ats_JC: If you IT folks are decent with Perl and PostgreSQL, they ought to be able to figure out a way to do what I've suggested. It should only take a couple of days to get it right. That will be faster than entering records by hand. |
10:57 |
Dyrcona |
s/you/your/ |
10:58 |
ats_JC |
ok ill ask him :) |
10:58 |
ats_JC |
ill show the procedures that you recommend :) |
10:58 |
ats_JC |
thank you sir! |
11:01 |
|
RoganH joined #evergreen |
11:01 |
ats_JC |
we are setting up evergreen 2.5.3 version |
11:02 |
RoganH |
off topic: setting up an IRC bouncer finally. Any recommendations for a good debian one? |
11:02 |
Dyrcona |
ats_JC++ |
11:04 |
ats_JC |
our IT said that he need to check the links tomorrow hahaha |
11:04 |
ats_JC |
so faasst |
11:05 |
RoganH |
check links? |
11:05 |
pinesol_green |
[evergreen|Galen Charlton] LP#1306176: force installation of Business::Stripe from CPAN - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=e8b967a> |
11:05 |
ats_JC |
dont know thats what he says |
11:19 |
|
lcathenry joined #evergreen |
11:20 |
dkyle |
Regarding https://bugs.launchpad.net/evergreen/+bug/1307553, what subfield should the bib source go in? |
11:20 |
pinesol_green |
Launchpad bug 1307553 in Evergreen "bib source should be included in the 901 field" (affected: 1, heat: 6) [Wishlist,New] |
11:24 |
Dyrcona |
dkyle: Subfield s? |
11:27 |
dkyle |
Drycona: s is not taken. and should source be saved in the marc 901 only for schema biblio? |
11:28 |
Dyrcona |
dkyle: I dunno. I just suggest s for Source and 901 is a "custom" field so we can do what we like. |
11:28 |
Dyrcona |
If auth records have source field, maybe they should get it set, too. |
11:30 |
dkyle |
Drycona: guess I should ask in lp. was wondering what orgs might be using what 901 subfields already for their purposes. |
11:42 |
|
RoganH joined #evergreen |
11:45 |
|
RoganH left #evergreen |
11:46 |
|
RoganH joined #evergreen |
11:46 |
|
mdriscoll left #evergreen |
11:52 |
jl- |
which record/line is causing this error: http://paste.debian.net/hidden/bb24a622/ ? would it be 31? |
11:59 |
|
ericar joined #evergreen |
12:00 |
Dyrcona |
I think that is referring to line 31 of the SQL and not line 31 of the input. |
12:12 |
dbwells |
I am off to lunch, but in my judgment things have settled down enough to get 2.6.0 cut later today. Please consider this a last call for objections, or for any small thing needing attention. Thank you. |
12:15 |
|
j_scott joined #evergreen |
12:16 |
RoganH |
dbwells: https://www.youtube.com/watch?v=xHash5takWU |
12:18 |
dbwells |
RoganH++ |
12:49 |
csharp |
historical question (possibly for eeevil): did the reporter ever restrict anything by library? meaning, has data ever been limited by like the workstation owning library or somesuch? |
12:49 |
csharp |
there are people here who seem convinced that you don't have permission to see other libraries' data |
12:50 |
csharp |
I'm convinced that Evergreen has never worked that way, but there are some fierce people who think otherwise |
12:50 |
phasefx |
jeff: did you ever figure out a solution to memcached key eviction? |
12:50 |
RoganH |
csharp: you mean not configured by giving permission to see folders but just working that way? |
12:51 |
phasefx |
csharp: I think the database is wide open if you can write a report at all, if that's what you mean |
12:51 |
csharp |
RoganH: right - not folders/templates/output but the data itself |
12:51 |
csharp |
phasefx: yes, that's what I mean |
12:51 |
phasefx |
but I'm not 100% sure |
12:51 |
csharp |
ok, that's what I thought |
12:51 |
RoganH |
csharp: Not 1.2 onward but I can't say before that |
12:51 |
phasefx |
I think passwd may be excluded somehow now, from actor.usr, for example |
12:52 |
csharp |
right, but as an Athens user, I can report on Gainesville's circ |
12:52 |
csharp |
that's what people are thinking was restricted before |
12:52 |
RoganH |
csharp: given the general design shape of Evergreen I find it hard to imagine it ever did |
12:52 |
csharp |
possibly in the System That Shall Not Be Named |
12:53 |
RoganH |
that would be a nightmare to implement |
12:53 |
csharp |
RoganH: I agree that it would suck |
12:53 |
csharp |
I don't think it's practical nor desirable |
12:53 |
RoganH |
you'd have to look at every data association and tell the system "this org unit isn't assigned to you, you can't pull that" |
12:54 |
jeff |
phasefx: the only solution as i see it, short of throwing memory at the problem, is re-engineering open-ils.auth to treat memcached as a cache, not a persistent session storage mechanism. |
12:54 |
phasefx |
csharp: so you can use suppress_control in fm_IDL.xml to restrict certain fields, but not org scoped |
12:54 |
RoganH |
my lord, if you loan materials like I know GA PINES and SCLENDS do your reports could become a total cluser$%^! |
12:54 |
phasefx |
jeff: yeah, fallback to the db? |
12:55 |
jeff |
phasefx: right. |
12:55 |
csharp |
phasefx: RoganH: thanks for the feedback |
12:55 |
jeff |
phasefx: or "a db" if there are major concerns about it being in "the db" |
12:55 |
phasefx |
jeff: thanks man; I'm going to see if there is some knobs and dials to memcached |
12:55 |
phasefx |
s/is/are/ |
12:55 |
RoganH |
csharp: yeah, based just on the loaning of materials and having to report on the records of others for your own circ I'd say you could dismiss that rumor |
12:56 |
jeff |
phasefx: there are some. i'm willing to compare notes but am unable to today. |
12:56 |
phasefx |
jeff: roger roger, thanks |
12:56 |
* csharp |
wishes it were that easy - the assumption ended up in a requirements doc |
12:57 |
kmlussier |
yboston++ #Your e-mail made me laugh. |
12:58 |
jeff |
we are starting to do "you can only report on YOUR things" in jasper, by essentially chrooting the user to a portion of the org tree. it's a template design thing, not an overreaching db-level constraint, though that's something i've been looking into as well. |
12:59 |
jeff |
since it just came up recently again... anyone here aware of a library using libki for public workstation management? |
12:59 |
jeff |
guess i could expand my question outside #evergreen |
13:04 |
|
jihpringe joined #evergreen |
13:14 |
|
bmills joined #evergreen |
13:18 |
csharp |
so in the closed dates editor, 'Apply to all of my libraries' apparently considers 'my libraries' to be ALL libraries in PINES |
13:19 |
csharp |
is that expected behavior? or is it supposed to be scoped by org level/permission? |
13:19 |
RoganH |
Ah yes, that's burned us a few times. |
13:19 |
csharp |
RoganH: yikes |
13:19 |
gmcharlt |
jeff: #koha is a better bet for finding libki users |
13:20 |
RoganH |
csharp: yeah, nothing good starts with 'Hey Rogan, guess what X did?' |
13:20 |
kmlussier |
csharp: If it is expected behavior, I would say it's a great candidate for a wishlist bug. Yikes! |
13:21 |
csharp |
wishlist, hell, that's done broke |
13:21 |
yboston |
gmcharlt: can you increase the WP maximum upload file size to 13 MB from 2 MB so I can update all the conference slides? |
13:22 |
gmcharlt |
yboston: sure, one moment |
13:22 |
csharp |
"Click Apply to all of my libraries if your organizational unit has children units that will also be closed." - the docs |
13:22 |
csharp |
which is what I expected to happen |
13:23 |
kmlussier |
That would have been my expectation too. |
13:23 |
Dyrcona |
csharp: Did you do it as a consortial admin or did library staff do it? |
13:23 |
csharp |
library staff |
13:23 |
Dyrcona |
I agree that it sounds like a bug. |
13:23 |
csharp |
we tested it on our test system as a local admin |
13:23 |
RoganH |
did the local admin have all working locations assigned? |
13:24 |
csharp |
RoganH: I don't know who did it |
13:24 |
RoganH |
I want to say the issue for us came up when internal help desk staff who had all working locations did it. |
13:24 |
csharp |
it appeared on everyone's closed dates, though, and it's 2 weeks away (our normal checkout duration) |
13:25 |
kmlussier |
Looking at the Closed Dates Editor, my expectation would be that it would apply to all the children of the OU that's selected at the top of the screen. |
13:25 |
csharp |
could someone please test after me? |
13:25 |
kmlussier |
Even if the person using the editor had permission at more working locations. |
13:25 |
csharp |
it may be something local |
13:25 |
RoganH |
I'll test right now. |
13:25 |
csharp |
RoganH: much appreciated |
13:34 |
RoganH |
csharp: just tested with my 2 accounts. Account A has all working locations. Account B only has the locations in one system. In both cases the closed date applied to all locations where that account has working locations regardless of where in the org tree they applied the date. |
13:34 |
RoganH |
If the "apply to all my libraries" is checked. |
13:35 |
RoganH |
If that is not checked it is only applied to the individual library. |
13:36 |
jeff |
gmcharlt: thanks! (re: libki) |
13:36 |
RoganH |
In neither case does it go down the org unit tree and apply to child org units. |
13:37 |
csharp |
RoganH: thanks |
13:39 |
dkyle |
Jeff: we are testing libki. bott has done a few mods for EG integration |
13:40 |
RoganH |
csharp: one of the reasons I never reported it as a bug is I'm OK with it functioning like that. It means it's not practical for me to set the closed dates for each library system and they have to do it. |
13:50 |
|
jwoodard joined #evergreen |
14:16 |
|
mcooper joined #evergreen |
14:18 |
|
RoganH left #evergreen |
14:18 |
|
RoganH joined #evergreen |
14:23 |
|
geoffsams joined #evergreen |
14:34 |
|
mrpeters joined #evergreen |
14:36 |
dbs |
Dyrcona: finally running your branch with test RDA bibs. Very interesting. |
14:36 |
|
kmlussier1 joined #evergreen |
14:37 |
dbs |
Dyrcona: in the case of record matching (OCoLC)825763702, there's "264 4 ‡c℗2013" that doesn't appear to be displayed anywhere |
14:38 |
* Dyrcona |
is in a meeting. I'll look later. |
14:39 |
dbs |
Also looks like, with more descriptive 7xx fields, we're going to have to revise the added author stuff further (probably picking up from a discussion that bshum and I were having a while back) |
14:39 |
dbs |
cool |
14:43 |
jl- |
still getting this error on the last batch ingest for today: http://paste.debian.net/hidden/bb24a622/ |
14:43 |
jl- |
any ideas how I can hunt down the record responsible? |
14:45 |
csharp |
okay EDI question... we're seeing that all identifer numbers except ISBN are getting ignored, even though they are an option in the lineitem dropdown - how do others account for this? |
14:45 |
dbs |
jl-: I've used a "bisect" approach before: cut your batch in half and try again until you isolate the offender |
14:45 |
csharp |
(when processing a PO via the PO JEDI reactor) |
14:46 |
jl- |
dbs: 1 / 10000 seems like a needle in a haystack |
14:47 |
eeevil |
jl-: bisecting only requires 32 cuts for 4B records ... less daunting than it seems ;) |
14:47 |
eeevil |
and you can weight your bisections. if it fails fast, just take the first 25% instead of cutting in half |
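The bisection can be done on record boundaries rather than lines, since binary MARC (ISO 2709) records end with a 0x1D record terminator. A sketch of one cut; each half gets re-tried until the offending record is isolated:

```python
# Sketch of the "bisect" approach suggested above: split a binary
# MARC (ISO 2709) file in half on record boundaries.
RT = b"\x1d"  # ISO 2709 record terminator

def split_marc(data):
    """Return (first_half, second_half) of a binary MARC batch."""
    records = [r + RT for r in data.split(RT) if r]
    mid = len(records) // 2
    return b"".join(records[:mid]), b"".join(records[mid:])

# usage (path is a placeholder):
# data = open("batch.mrc", "rb").read()
# first, second = split_marc(data)
```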
14:48 |
yboston |
gmcharlt: not sure if you were able to change the WP, but if you did, it still shows 2 MB |
14:53 |
|
hbrennan joined #evergreen |
14:54 |
dbs |
Hmm. Methinks the CRA/MVF stuff might have thrown off the schema.org mappings for Book, MusicAlbum, Map, etc; everything seems to be coming up as a generic CreativeWork |
14:54 |
* dbs |
will dig deeper |
14:55 |
|
kayals_ joined #evergreen |
14:58 |
|
kayals_ joined #evergreen |
15:05 |
dbs |
@later tell Dyrcona dang, the batch of RDA bibs you supplied only has 18 records with "264 ind2=<anything other than 1>" and all of those simply have 264 ind2=4 $c<date>" which doesn't get displayed; that is, |
15:05 |
pinesol_green |
dbs: The operation succeeded. |
15:06 |
dbs |
@later tell Dyrcona that is, the test bibs don't really help with testing that part of the patch. Really useful for the added author / schema.org type matching though! |
15:06 |
pinesol_green |
dbs: The operation succeeded. |
15:06 |
Dyrcona |
dbs: Hmm. sorry. I grabbed records that had all the RDA fields that came up in the IRC discussion. |
15:09 |
dbs |
Dyrcona: no worries! if you want to hand-pick a few that have more interesting 264 fields for your patch's purpose, it would be easy to roll those in |
15:10 |
Dyrcona |
dbs: If I can find the ones that I tested with, sure. |
15:10 |
Dyrcona |
Movies seem to be good choices. |
15:10 |
dbs |
Dyrcona++ |
15:23 |
yboston |
dbs & Dyrcona : the two Berklee catalogers will start picking RDA records to be added to the EG data set |
15:23 |
yboston |
1) I ma sure they will try to pick music related records, any other requests? |
15:23 |
yboston |
2) what do I do with the records once they pick them. who do I send them too? the dev list? |
15:24 |
kayals_ |
how do I configure TPAC to display Electronic Resource field |
15:25 |
kayals_ |
in table.tt2 it has a section where it says args.uris |
15:26 |
kayals_ |
http://paste.evergreen-ils.org/59 |
15:28 |
kmlussier1 |
kayals_: IIRC, you shouldn't need to configure TPAC to display the Electronic Resource field. If the record has an 856 field, it should just display. |
15:28 |
kayals_ |
in summary.tt2 it shows 856 field |
15:29 |
kayals_ |
the record detail page "marc record" shows 856 field |
15:33 |
kmlussier |
table.tt2 is the search results page, right? Using the tpac defaults, I think it only displays the 856 field in the "more details" view of the search results page. |
15:33 |
kayals_ |
yes |
15:34 |
kayals_ |
Electronic resources does NOT show in the search result page but does show up in record summary |
15:34 |
kmlussier |
kayals_: Does it show up in the search results after you click the "Show more details" button at the top of the page? |
15:34 |
kayals_ |
i know other libraries have their search result page show Electronic Resource listed with links |
15:35 |
kayals_ |
we do not use show/hide more details as we removed that tabs |
15:35 |
kayals_ |
i can add it and test it |
15:53 |
kayals_ |
kmlussier - I added back show/hide more details in the results.tt2 file still no luck |
15:53 |
kayals_ |
it does not show electronic resource |
15:56 |
|
kmlussier1 joined #evergreen |
16:07 |
bshum |
kayals_: URI entries for 856 only appear when making use of the $9 trick |
16:07 |
rjackson-isl |
right, and don't you need to be logged in to the branch showing in the $9 as well? |
16:07 |
bshum |
Oh hey cool: https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&uact=8&ved=0CDwQFjAC&url=https%3A%2F%2Fdocs.google.com%2Fdocument%2Fd%2F1xhAnKg3CSc2Q7kIQYHnqUtGIxEr97EZhYWc5CjhhR6A%2Fedit%3Fhl%3Den_US&ei=bpFNU862OqbO0gHr7ICQCQ&usg=AFQjCNFbOIRCoVJYxHk18fkbULxnMuxoxw&bvm=bv.64764171,d.dmQ |
16:08 |
bshum |
Old docs explaining located URIs back from 1.6 days! |
16:08 |
bshum |
rjackson-isl: Depends actually, by default yes. In 2.6+, there are library settings which change how those are visible in different scoped searches. |
16:09 |
rjackson-isl |
bshum++ we are back at 2.5.2 |
16:10 |
kayals_ |
bshum - Do we use the one from misc_util.tt2 in table.tt2? |
16:11 |
bshum |
kayals_: I'm not sure I understand what you're asking? |
16:12 |
kayals_ |
sorry about that |
16:12 |
kayals_ |
i see some code with $9 trick in misc_util.tt2 |
16:13 |
kayals_ |
i am not sure how to use it to display Electronic Resource in table.tt2 |
16:13 |
bshum |
The subfield 9 is applied to the MARC records individually. Each 856 would have a new subfield 9 with the org unit's shortname to correspond to whether you wanted it to display for a given location. |
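For illustration, an 856 tagged this way might look like the following (the URL and contents are made up, and `BR1` stands in for whatever shortname your org unit actually uses):

```
856 40 $u http://ebooks.example.org/12345 $y Online access $9 BR1
```

Each 856 that should display within a given location's search scope gets its own $9 carrying that org unit's shortname.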
16:14 |
eeevil |
bshum: beware, the "exactly like copies" is a bit of a lie in these modern times |
16:15 |
bshum |
eeevil: Indeed :) |
16:16 |
kayals_ |
got it |
16:16 |
kayals_ |
thanks bshum |
16:19 |
kmlussier |
kayals_: Sorry, I had to run out for a bit, but it looks like bshum steered you in the right direction. :) |
16:19 |
kmlussier |
bshum++ |
16:19 |
kmlussier |
@dessert bshum |
16:19 |
* pinesol_green |
grabs a slice of Chocolate Lava Cake and sends it sliding down the dessert bar to bshum |
16:19 |
kayals_ |
thanks guys |
16:20 |
kmlussier |
bshum always gets the good desserts. |
16:20 |
kayals_ |
:) |
16:22 |
eeevil |
dbs: looks like it's the metarecord stuff that is causing schema.org pain. specifically, it looks like the opac.icon_attr global flag changing from item_type to, well, not item_type is causing map misses. looks like schema_typemap may want to be (more easily) configurable? Perhaps a separate template file pulled in by the get_marc_attrs BLOCK? |
16:23 |
eeevil |
or, of course, a new table (or column on ccvm) would do |
16:29 |
eeevil |
dbs: well, I take part of that back. the "icon sub-project" was mixed in with all three of those (mvf/cra/metarecords) ... in any case, the icon part's lack of schema.org awareness (in schema_typemap) is the cause |
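A separate, locally overridable template along the lines eeevil suggests might look something like this (a hypothetical Template Toolkit sketch; the keys and schema.org type names are illustrative, not the stock schema_typemap):

```
[%- # Hypothetical local override file, pulled in by the get_marc_attrs
    # BLOCK; maps a configured record-attribute value to a schema.org type.
    schema_typemap = {
        'a' => 'Book',        # language material
        'g' => 'Movie',       # projected medium
        'j' => 'MusicAlbum',  # musical sound recording
    };
-%]
```

Keying the map on whatever attribute opac.icon_attr is configured to use, rather than assuming item_type, would avoid the map misses described above.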
16:31 |
hbrennan |
Just received an official url for my library's future Awesome Box! (pardon the interruption, back to business) |
16:35 |
gmcharlt |
yboston: sorry for the delay; it should work now |
16:40 |
yboston |
gmcharlt: thanks, it works |
17:07 |
|
RoganH left #evergreen |
17:13 |
pinesol_green |
Incoming from qatests: Test Success - http://testing.evergreen-ils.org/~live/test.html <http://testing.evergreen-ils.org/~live/test.html> |
17:22 |
|
mmorgan left #evergreen |
18:43 |
bshum |
Hmm |
18:43 |
bshum |
So in Open-ILS/xul/staff_client/server/circ/checkout.js |
18:43 |
bshum |
There's stuff in there for auto_override of certain types of events |
18:44 |
bshum |
Is there any particularly bad reason not to include additional types of events beyond the stock patron penalties? |
18:44 |
bshum |
I guess other than people not paying attention if they scan lots of things that shouldn't be checked out... |
18:44 |
bshum |
Hmm, nevermind it sounds unsafe :) |
18:45 |
bshum |
That said, I think that maybe this would answer the question I was asked earlier today in a meeting about why a circ policy preventing an item checkout would need to be overridden each time you checked out that material type. |
18:45 |
bshum |
Like a rule that says no DVDs for kids, but they make exceptions all the time.... |
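The auto-override pattern being discussed could be sketched like this (a hypothetical illustration, not the actual Open-ILS/xul/staff_client/server/circ/checkout.js code; the event textcodes are examples of stock patron penalties, not an exhaustive list):

```javascript
// Hypothetical sketch of an auto-override whitelist for checkout events.
// Only events listed here would be silently re-submitted with an override
// flag; everything else still prompts staff to confirm manually.
var AUTO_OVERRIDE_EVENTS = [
    'PATRON_EXCEEDS_OVERDUE_COUNT',
    'PATRON_EXCEEDS_FINES'
];

function shouldAutoOverride(eventTextcode) {
    return AUTO_OVERRIDE_EVENTS.indexOf(eventTextcode) !== -1;
}
```

Adding circ-policy events to such a list would make those failures silently overridable too, which is exactly the "people not paying attention" risk bshum raises above.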
19:55 |
|
kmlussier joined #evergreen |
20:39 |
|
Dyrcona joined #evergreen |
21:22 |
|
kmlussier joined #evergreen |
21:27 |
kmlussier |
dbs++ # http://opensource.com/education/14/4/evergreen-library-system |
22:41 |
pinesol_green |
[evergreen|Dan Wells] Translation updates - po files - <http://git.evergreen-ils.org/?p=Evergreen.git;a=commit;h=ab9c784> |
23:55 |
|
zerick joined #evergreen |