01:46  remingtron_ joined #evergreen
01:49  felicia_ joined #evergreen
01:54  rashma_away joined #evergreen
01:54  mnsri_away joined #evergreen
01:57  genpaku joined #evergreen
07:11  rjackson_isl joined #evergreen
07:55  Dyrcona joined #evergreen
08:06  jvwoolf joined #evergreen
08:11  bos20k joined #evergreen
08:37  mmorgan joined #evergreen
08:42  plux joined #evergreen
08:51  bos20k joined #evergreen
08:58  lsach joined #evergreen
09:33  yboston joined #evergreen
09:47  alynn26 joined #evergreen
10:35  sandbergja joined #evergreen
10:41  khuckins_ joined #evergreen
11:04  khuckins joined #evergreen
11:31  sandbergja joined #evergreen
12:03  nfburton joined #evergreen
12:20  khuckins joined #evergreen
12:31  jvwoolf1 joined #evergreen
12:32  Christineb joined #evergreen
12:43  yboston joined #evergreen
13:26  kmlussier joined #evergreen
13:37  * kmlussier looks at the logs to find out what she missed this morning and finds nothing.
13:37  <kmlussier> Must be Friday.
13:37  <Bmagic> I was about to make a comment to that effect
13:38  <mmorgan> @dessert [someone]
13:38  * pinesol grabs some Pecan Pie for csharp
13:38  <mmorgan> Hmm. csharp seems to get all the pies.
13:40  <Dyrcona> Ssh....
13:49  <kmlussier> @love pie
13:49  <pinesol> kmlussier: The operation succeeded. kmlussier loves pie.
13:49  <kmlussier> @loves
13:49  <pinesol> kmlussier loves parts; YAOUS; Fridays; clam chowder; coffee; new fanged email thing; quassel; magic eightball; trivia; Evergreeners; BBQ; spell check; mobile catalog; new edit links in the catalog; vim; pizza; grep; spring; summer; fall; clam chowdah; ginger beer; DokuWiki; the random magic spells page; Evergreen, the whole kit and caboodle; bug squashing day; acq; reducing mouse clicks; (1 more message)
13:50  <kmlussier> @more
13:50  <pinesol> kmlussier: quickpick; developers who bring us the features we need just when we need them; git; IRC practice time; git quickpick; rolos; bug squashing week; Tom Waits; chocolate chip cookies; tethering; software; and pie
13:50  <kmlussier> There is a lot of food in that list.
13:51  <Bmagic> lol
13:51  <Dyrcona> :0
13:51  <Dyrcona> :)
13:52  <Dyrcona> @loves
13:52  <pinesol> Dyrcona loves git; sed; OpenBSD; gnu/emacs; and git-quickpick
13:52  <Dyrcona> @hates
13:52  <pinesol> Dyrcona hates JavaScript; and Launchpad Search
13:53  <kmlussier> @hates
13:53  <pinesol> kmlussier hates Launchpad search; Internet Explorer; snow; scheduling meetings; Starbucks; negative balances; undrinkable coffee; winter; blizzards; spam; dojo interfaces; Windows line endings; peanut M&Ms; bad technology days; authorities; pollen; comcast; comcast even more; inconsistent bugs; intermittent problems; and deadlines
13:53  <Dyrcona> @whocares JavaScript
13:53  <pinesol> _adb, csharp, aabbee and Dyrcona hate JavaScript
14:07  * csharp is trying to stay away from pie and passes it to mmorgan
14:07  <csharp> @loves
14:07  <pinesol> csharp loves supybot plugins; virtualization; lasagna; logs; clarity; all y'all; upgrades; tpac; git; this venue; google; not being evil; when evergreen problems turn out to be staff error; the Jedi; pgadmin; policy; lynx; autoupdate; coffee; db02; kvm; CWMARS; mobile catalog; vim; snow; pizza; google forms; chicken soup; cilantro; actual fun; supybot; vimdiff; felafel; postgresql; THE RESISTANCE; (1 more message)
14:07  <csharp> @more
14:07  <pinesol> csharp: vacuum full; strong opinions; data; Data from TNG; simple fixes to scary-looking problems; and all the things
14:08  jihpringle joined #evergreen
14:08  <csharp> @hates
14:08  <pinesol> csharp hates dojo_hold_policies_interface; SIP; when libraries purchase third party products without testing and blame Evergreen for it not working; reports; the fact that the Base Filters is unnecessarily greyed out when applying an Aggregate Filter and vice versa; evil; reports more; reports even moar; details; reports even more; the fact that the Base Filters is unnecessarily greyed out when (2 more messages)
14:08  <csharp> @more
14:08  <pinesol> csharp: applying an Aggregate Filter and vice versa even more; having to teach SIP2 client vendors about the SIP2 specification; troubleshooting reports; money reports; marc; reports even more than before; the EDI ruby bits; acquisitions; <quote>fun<unquote>; edi; sip2; sip too; sip two; acq; acq more; acq way more than before; omg I hate acq; omg I love acq; hate hate hate; comcast; action_triggers; (1 more message)
14:08  <csharp> @more
14:08  <pinesol> csharp: javascript; and action_triggers more
14:09  <csharp> one would draw the conclusion that I'm a hate-filled person
14:09  <kmlussier> csharp: So what do you think about reports? ;)
14:09  * berick steps backwards slowly
14:13  <JBoyer> csharp appears to have missed variants such as sip dos, SIP II: THE SIPPENING, sip++, and sip sip + (RPN)
14:15  <berick> heh
14:15  <Dyrcona> :)
14:15  <berick> SIP III: THIS TIME IT'S PERSONAL (because you made a 64 patron info request)
14:15  * Dyrcona puts on Viking hat and chants: SIP, SIP, SIP, SIPPITY SIP!
14:16  <Dyrcona> Upgrade 1106 takes a long time.
14:16  <Bmagic> dbwells++ # email
14:16  <Dyrcona> And, rollback... :(
14:17  <Dyrcona> I spoke too soonish.
14:18  <Dyrcona> Oh, great. Made it to 1120 this time.... and then insert or update on table "z3950_attr" violates foreign key constraint "z3950_attr_source_fkey"
14:18  <Dyrcona> Guess my training server won't be ready on Monday.
14:19  <Dyrcona> Looks like we deleted loc as a Z39.50 source.
14:21  <Dyrcona> Suppose I should UPC for our added sources.
14:21  <Dyrcona> add, that is.
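The constraint in question ties config.z3950_attr.source to config.z3950_source.name, so a data upgrade that ships new attributes for a stock source fails when that source has been deleted locally. A minimal sketch of the check-and-re-add Dyrcona is describing; the table and column names follow the stock Evergreen schema as best I can tell, and the loc connection values are placeholders rather than authoritative seed data:

    -- See which Z39.50 sources are still defined locally.
    SELECT name, label FROM config.z3950_source ORDER BY name;

    -- Re-add the deleted stock source before re-running the upgrade.
    -- (Host/port/db values below are illustrative placeholders; pull the
    -- real ones from the stock seed data for your version.)
    INSERT INTO config.z3950_source (name, label, host, port, db)
    VALUES ('loc', 'Library of Congress', 'lx2.loc.gov', 210, 'LCDB');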
14:22  yboston joined #evergreen
14:23  <csharp> we removed LoC as a source too
14:27  <Dyrcona> Well, my development db server is getting a workout.
14:27  <csharp> Dyrcona: applying https://bugs.launchpad.net/evergreen/+bug/1793802 in advance may help on the next run of 1106
14:27  <pinesol> Launchpad bug 1793802 in Evergreen "Wishlist: Age billing/payment data with circs" [Wishlist,New]
14:28  <Dyrcona> It's frustrating though, running for hours, only to have a rollback, fix that, run even more hours, another rollback, fix, repeat.
14:28  <csharp> Dyrcona: +1
14:28  <csharp> I did that this weekend
14:29  <Dyrcona> csharp: While that bug/branch looks like it will do the trick, I'm going to pass.
14:30  <Dyrcona> We're not actually going to upgrade for a few months. We upgrade training well in advance.
14:30  <csharp> yeah - makes sense
14:32  <Dyrcona> Since it took about 6 hours to make it to 1120 and blow up, I assume it will be at least that long until it blows up again, so I might as well close that window and come back to it on Monday.
14:32  <Dyrcona> @love tmux
14:32  <pinesol> Dyrcona: The operation succeeded. Dyrcona loves tmux.
14:34  <jeff> How long is 1106 taking for you guys? Do you have unusual triggers or unusually-named triggers?
14:35  <csharp> JBoyer: 1106 took me 10 hours on my initial run - down to 4 after applying that bug fix
14:35  <csharp> sorry, meant for jeff
14:36  <csharp> I'm working to purge more circs right now to get it down even more
14:36  <berick> my mileage is similar to csharp's
14:36  <jeff> how many rows in money.billing?
14:36  <Dyrcona> That's easy. UPDATE 69378640
14:37  * csharp counts "1, 2, 3, 4, 5... darn - lost count. 1, 2..."
14:37  * Dyrcona was typing a long answer to say that I didn't time any upgrade specifically, just estimated based on starting around 8:30 this morning.
14:37  <csharp> before any sort of deletion, we have 195006359
14:37  <berick> 142M in money.billing
14:38  <berick> pines++ taking names
14:38  <csharp> heh
14:39  <csharp> we did a massive billing deletion in 2011 or so in advance of an upgrade, so it probably would've been double that
14:39  * jeff nods
14:41  <Dyrcona> Just to clarify, my "6 hours" is total runtime for all the upgrades up to 1120, though 1106 followed by 1098 have been the longest two.
14:43  * Dyrcona added \qecho Doing upgrade #### before each upgrade step.
14:43  <Dyrcona> That helps figure out where it was when there is a rollback.
14:44  <berick> my go-to is: select clock_timestamp(), 'applying upgrade abc';
14:45  <Dyrcona> Ooh, shiny!
14:45  <Dyrcona> Maybe I'll steal that.
14:45  * csharp is totally gonna do that
14:46  <Dyrcona> It's just a regex replace away.
14:48  <csharp> we should add that to the make_release script
14:48  <csharp> every admin needs it
14:50  <Dyrcona> Shouldn't be that hard. I actually copied out the relevant bits to make just the db upgrade script recently. Put it on my pastebin.
14:50  <Dyrcona> It's a working bash script.
14:51  <Dyrcona> I'm going with something like this: SELECT clock_timestamp(), 1132 AS "Upgrade";
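Pulling the last few lines together, this is roughly what the hand-tweaked upgrade script would look like, with a marker ahead of each step so the psql output records when each step started and which step a ROLLBACK belongs to. The \qecho and SELECT lines are the ones quoted above; the step numbers and filenames are placeholders for whatever the generated script actually includes:

    \qecho Doing upgrade 1106
    SELECT clock_timestamp(), 1106 AS "Upgrade";
    \i upgrade/1106.placeholder.sql

    \qecho Doing upgrade 1107
    SELECT clock_timestamp(), 1107 AS "Upgrade";
    \i upgrade/1107.placeholder.sql

Dyrcona's "regex replace" presumably amounts to inserting that pair of lines ahead of every \i (or every upgrade header) in the concatenated script, which is also what a make_release change would automate.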
14:53  <kmlussier> @love tmux
14:53  <pinesol> kmlussier: The operation succeeded. kmlussier loves tmux.
15:00  <Dyrcona> Guess more stuff must be cached on the db server. It made it to 1106 already.
15:01  <jeff> are you running with \timing on by chance?
15:01  khuckins joined #evergreen
15:02  <jeff> Part of me is curious just for the potential of understanding postgresql a bit more. I'm wondering why our performance and your performance don't seem to match up quite the same in terms of time per million rows.
15:02  <jeff> (this came up before, I mused about it then)
15:04  <Dyrcona> jeff: I don't normally run with \timing, ever.
15:04  <Dyrcona> I don't judge performance on this server. It is a repurposed mail server.
15:04  <jeff> Hah.
15:05  <Dyrcona> It has lots of disk and a decent amount of RAM, and it's tuned for PostgreSQL, but it could be tuned better.
15:06  <Dyrcona> My training server is an old db server, but it also runs services, etc. I have not run the upgrades on their, yet. I'm still working out the kinks.
15:06  <Dyrcona> English spelling is stupid.
15:06  <Dyrcona> So, are typos. :)
15:07  <Dyrcona> Anyway, I lot the track of where I was going. I probably said too much already. :)
15:07  <Dyrcona> s/lot/lost/
15:36  <Bmagic> Every time I consider this, I become unsure. Holds: they prefer holds at the branch where the item is, then branches within the same system, then any branch. If a system has flattened proximity with actor.org_unit_proximity_adjustment.prox_adjustment = 0, will it still do the same thing?
15:42  <Dyrcona> Bmagic: In general, yes. There are a couple of settings that can change things.
15:44  <JBoyer> The best hold sort OUS can make that as maddening to determine as you like.
15:44  <Bmagic> we have a system that prefers FIFO, and I have best_sort_order set to request time at the top and they have prox_adjustment=0. That should achieve FIFO. But what happens when a hold from outside of their system has a better request_time?
15:44  <Dyrcona> Try it and see what happens.
15:44  <Bmagic> :) that's what I was thinking!
15:45  <JBoyer> The answer is also impossible for us to say without knowing a lot more about your holds matrix, circ mods, etc. etc. ;)
15:45  <Bmagic> sure, understood
15:46  <JBoyer> It'll probably try to fill the outside hold if it's allowed.
15:46  <Bmagic> yeah, that's what I am thinking
15:46  <Dyrcona> Which way the wind is blowing... Whether it's an African or European Swallow....
15:46  <JBoyer> And that library will likely stop preferring FIFO fairly quickly
15:47  <Bmagic> age protection works and they use that too
15:47  <Bmagic> it's after the age protection interval that is the issue
15:47  <Dyrcona> FIFO really only works if every location does it, so it's pretty much right out for a consortium.
15:48  <Bmagic> "riht out" (Do I hear a Monty Python reference in there?) LOL
15:48  <Bmagic> "right out" rather
15:48  <Dyrcona> There were two in the past 3 minutes.
15:48  <Bmagic> "1, 2, FIVE"
15:49  <Dyrcona> Three, sir.
15:49  <JBoyer> "The First hold that is placed In shall be the First hold that is pulled Out. Neither the second, nor the third, and the fourth is right out."
15:49  <Bmagic> Yeah, you set me up with European Swallow
15:49  <Bmagic> haha
15:49  <Dyrcona> Holds are not simple. Nothing is in Evergreen.
15:50  <Bmagic> you don't have to tell me, holds! Makes me want to say it like the movie Nerds. NERDS? NEEEAAARDS!
15:50  <Dyrcona> It mostly does the "right thing" until you try to second-guess it, then you get into trouble.
15:51  <Dyrcona> Staff always seem to fixate on the one "anomalous" hold out of the tens of thousands placed every day.
15:53  <mmorgan> Bmagic: We use Holds always go home, so after age protection expires, items will travel, but will always come home to fill a hold.
15:53  <Bmagic> That might be a good option, though it doesn't fix the newly cataloged items issue
15:54  <mmorgan> Staff check in the newly cataloged items with the Retarget local holds/Retarget all statuses checkin modifiers.
15:55  <mmorgan> Not perfect, but works for the most part.
15:58  <Bmagic> Is it not possible to have your cake and eat it too? Can Evergreen accommodate the settings such that the patrons at that specific system (8 branches) get dibs on the items in FIFO order, but only for those patrons? Then, when all of the holds are done for that system, only then will it consider other holds?
16:01  <mmorgan> So you want an item to travel around those 8 branches filling the holds chronologically, then after holds are filled within those 8, the item can travel elsewhere?
16:01  <Bmagic> yep
16:02  <Bmagic> even if a hold outside of those branches has a better request_time
16:02  <bshum> Bibliomation did that with one of their largest branch library systems. So it should be possible.
16:02  <bshum> It was FIFO in that system and the branches, and then normal elsewhere
16:03  <Dyrcona> Unless you want to pay for shipping a book around, I don't think you really want FIFO in 8 branches, but what do I know?
16:03  <Bmagic> config.best_hold_order is the issue... maybe I don't have the right knobs turned
16:04  <bshum> I don't remember how it was done offhand, but yeah, it was best hold order configuration stuff.
16:08  <Bmagic> It seems that I want pprox at the top, then priority, then cut, then rtime
16:09  <bshum> I think we changed the proximity too for the libraries
16:09  <bshum> In the branch
16:09  <Bmagic> wait, adjusted capture location to pickup lib prox [approx] at the top?
16:09  <bshum> item circ lib = SYS ; pickup_lib = SYS ; proximity_adjustment = 0 ; absolute = true (checked off)
16:09  <mmorgan> Bmagic: We have pprox at the top, followed by priority, then cut, then depth, then rtime
16:09  <bshum> Then we used aprox ; priority (optional) ; cut in line (optional), rtime
16:09  <bshum> For the sort order
16:10  <bshum> So with the custom proximity change, where all the system branches became proximity 0 to each other, it forced them to take precedence over the rest of the consortium
16:10  <Bmagic> [pprox] doesn't respect actor.org_unit_proximity_adjustment.prox_adjustment but [approx] does?
16:11  bdljohn left #evergreen
16:11  <Bmagic> bshum: that's what I have as well (I think you helped me when we set this up a couple of years ago)
16:11  <bshum> Probably :)
16:11  <Bmagic> probably more like 4 years ago... geez
16:11  <bshum> I'm reading off old emails from the 2014/15 era
16:12  <bshum> So yeah
16:15  <Bmagic> thanks y'all, off to the test server for many hours
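For anyone trying to reproduce the setup bshum describes, it boils down to two pieces, sketched below in SQL. The table and column names are my best reading of the stock Evergreen schema, and the org unit ID and sort positions are invented for illustration, so treat this as a starting point to verify rather than a drop-in configuration:

    -- 1. Flatten proximity within the system: a copy circulating anywhere
    --    in org unit 104 (a hypothetical system) counts as distance 0 from
    --    any pickup library in that same system, overriding tree distance.
    INSERT INTO actor.org_unit_proximity_adjustment
        (item_circ_lib, hold_pickup_lib, prox_adjustment, absolute_adjustment)
    VALUES (104, 104, 0, TRUE);

    -- 2. A best-hold-order that sorts by adjusted proximity first and
    --    request time last (lower number = sorted earlier), which
    --    approximates FIFO inside the flattened system.
    INSERT INTO config.best_hold_order (name, aprox, priority, cut, depth, rtime)
    VALUES ('System FIFO-ish', 1, 2, 3, 4, 5);

The best-hold-order row still has to be selected by the best-hold-selection library setting for those org units, and as JBoyer notes above, a hold from outside the system can still be filled if policy otherwise allows it.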
16:46  <Bmagic> I just printed to a Dymo printer from the web client using Hatch
17:09  mmorgan left #evergreen
17:11  <jeff> Bmagic++
17:12  <jeff> congrats!
17:13  <Bmagic> thanks
17:14  <Bmagic> It didn't get anything on the page, but hey, no error!
17:17  <Bmagic> Anyone here have a Dymo printer?
17:18  <rhamby> heh, probably in a box somewhere in the garage
17:19  <sandbergja> Bmagic: we've got one! We don't use Hatch, though
17:19  <Bmagic> Looking for someone to test-drive this code with Webby/Hatch/Dymo
17:19  <Bmagic> It might be a matter of adjusting the template to get the label to print in the upper left corner or something like that
17:20  <Bmagic> (I don't have the physical printer, just the driver installed. Not getting errors now. Had a library on the phone and got it to print out a blank label, but they had to go)
17:27  <alynn26> I have a Dymo, let me try. Which server do you want me to use?
17:27  <Bmagic> you have to download the new hatch.java file and replace the one on your workstation
17:27  <Bmagic> https://dropbox.mobiusconsortium.org/pickup/hatch.jar
17:28  <Bmagic> sorry, hatch.jar - which would be located at "C:\Program Files (x86)\Hatch\lib"
17:31  <alynn26> Ok, connect to the Noble server or Webby?
17:33  <Bmagic> Any webby
17:33  <Bmagic> configure hatch to use printing
17:35  <alynn26> connected to webby.evergreencatalog.com. It's not giving me any printers in the drop-down list.
17:44  <Bmagic> weird - restart browser?
17:44  <Bmagic> logout/login
17:45  <Bmagic> but not sure it matters, I've confirmed that there are still some kinks in the code and I've sweet-talked one of our libraries into lending me a printer for a while
17:52  <alynn26> Restarting worked. I'm not getting any errors in the Console, but it's not actually printing.
17:55  <Bmagic> not even blank?
17:55  <alynn26> I see where it sends it to hatch, and hatch then responds, but nothing happens, not even a blank label.
17:55  <Bmagic> One of the bugs right now is that the Dymo needs to be your default printer
17:57  <alynn26> Ok
17:58  <alynn26> set it as the default, same thing. I have a Dymo LabelWriter 450 Duo.
17:58  <Bmagic> alynn26: thanks for trying! I am hacking more on it next week.
17:58  <alynn26> Ok, if you want me to do some testing, let me know. Just email me.
17:58  <Bmagic> right on! Thanks
17:58  <alynn26> good night.
18:08  jvwoolf joined #evergreen
21:07  stevea joined #evergreen
22:40  sandbergja joined #evergreen