Evergreen ILS Website

Results for 2022-06-11

06:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-11_04:00:04/test.49.html>
23:09 mmorgan joined #evergreen

Results for 2022-06-10

02:33 akilsdonk joined #evergreen
06:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-10_04:00:02/test.49.html>
07:26 rfrasur joined #evergreen
07:32 rjackson_isl_hom joined #evergreen
07:38 collum joined #evergreen
09:02 csharp_ screw it, I'll rewrite it in perl
09:04 csharp_ do we use warnings and use strict, Barry?  Yes, other Barry, we do.
09:07 JBoyer -Wall -Werror
09:08 JBoyer jeffdavis++ for looking into the http tests.
09:09 JBoyer Not sure the nginx issue is an issue for most testing though, I don't know what benefit there would be to standing that proxy up just for the tests.
09:19 mantis1 joined #evergreen
09:29 * Dyrcona missed something. To the logs!
09:31 Dyrcona JBoyer: Is it a nginx issue or is it the directories missing again?
09:32 Dyrcona csharp_: You can also install Modern::Perl and 'use Modern::Perl;' to get use strict; use warnings; and some other useful things.
09:32 Dyrcona The deb is libmodern-perl-perl
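
As a quick illustration of Dyrcona's suggestion: Modern::Perl (packaged on Debian as libmodern-perl-perl, as noted above) bundles strict, warnings, and newer language features into a single pragma. A minimal sketch:

    #!/usr/bin/perl
    # The traditional pair of safety pragmas...
    use strict;
    use warnings;

    # ...or the one-line equivalent Dyrcona mentions, which also enables
    # newer features such as say():
    use Modern::Perl;

    say 'strict and warnings are both in effect';
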
09:33 mantis2 joined #evergreen
09:33 Dyrcona JBoyer: IIRC, none of the other live tests require even Apache to be running, just OSRF services.
09:34 * Dyrcona likes it when a program that I haven't used/touched in 3 years or so still works.
09:36 JBoyer Dyrcona, I was referring to what jeffdavis said yesterday about one of the tests giving a false positive if nginx is running on the test machine. You might have to hit the logs if you're not running a bouncer.
09:36 JBoyer And there are a couple http tests, but not many, no.
09:52 Dyrcona JBoyer: OK. I was out yesterday. Also, I notice that image uploader test failed this morning.
10:15 csharp_ Dyrcona: thanks - I'll check it out
10:16 * csharp_ tries to rid his brain of Bowie's Modern Love
10:17 berick hey csharp_, roam if you want to
14:22 rfrasur joined #evergreen
17:08 mmorgan left #evergreen
17:31 jvwoolf left #evergreen
18:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-10_16:00:03/test.49.html>
18:56 pinesol News from commits: LP1950345-Format the Current Hold Groups table in bootstrap opac <https://git.evergreen-ils.org/?p=Evergreen.git;a=commitdiff;h=1b466467109a9013dd3799d69be6e80c968ce433>
19:56 pinesol News from commits: Hold Management page update <https://git.evergreen-ils.org/?p=Evergreen.git;a=commitdiff;h=2678ee1dfcc448968ba8e8ae7fd0da1bc69eeecf>

Results for 2022-06-09

06:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-09_04:00:02/test.49.html>
07:28 collum joined #evergreen
07:33 rjackson_isl_hom joined #evergreen
07:48 rjackson_isl_hom joined #evergreen
09:25 mantis1 joined #evergreen
09:30 jvwoolf left #evergreen
09:31 jvwoolf joined #evergreen
09:58 mantis1 I upgraded one of our test servers from 3.6.5 to 3.9.  Has anyone had weird spacing problems in the BooPAC after an upgrade?  I have no idea why it's spacing out so much like it is.  There is a width=device-width element added in base.tt2, but it doesn't seem to make a difference if it's deleted or not.
10:37 mantis2 joined #evergreen
10:38 csharp_ mantis1: is it stock EG or did you apply customizations on top of the upgrade?
10:39 csharp_ (or conversely, applied stock EG stuff on top of your customized stuff?)
17:02 csharp_ I'm on at&t, which while not awesome either, at least it's not comcast :-)
17:21 mmorgan not_comcast++
17:28 mmorgan left #evergreen
18:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-09_16:00:02/test.49.html>
20:14 jeffdavis the 34-lp1787968-cover-uploader.t live test is failing with a 404 on http://127.0.0.1/jacket-upload
20:16 jeffdavis 29-lp1817645-remoteauth-patron-api.t also does HTTP requests, but that test is actually being skipped right now because the test environment's version of LWP::Protocol::https is not >=6.0.7
20:19 jeffdavis and 24-offline-all-assets.t (which does a wget on https://localhost/eg/staff/offline-interface) technically isn't failing but I'm not sure the test is robust - in an environment with an nginx proxy, the test will pass if apache isn't running because the response is a 502 error and the test only checks for 404
20:20 jeffdavis I'm also not really sure that the wget output is being parsed properly
20:22 jeffdavis in other words, none of the 3 live tests that do direct HTTP requests against the test server are really "working"
20:23 jeffdavis something to fix at the hackfest maybe?
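
The fix jeffdavis is describing could look something like the sketch below: assert the exact status the interface should return (assumed 200 here), so that a 502 from an nginx proxy in front of a stopped Apache fails the test rather than slipping past a 404-only check. The URL is the one quoted above; the Test::More/LWP::UserAgent framing is an assumption about how the live tests are structured, not a copy of the real test file:

    use strict;
    use warnings;
    use Test::More tests => 1;
    use LWP::UserAgent;

    # Endpoint quoted in the discussion above; adjust for your test host.
    my $url = 'https://localhost/eg/staff/offline-interface';

    # Skip certificate checks for a self-signed localhost cert.
    my $ua = LWP::UserAgent->new(
        ssl_opts => { verify_hostname => 0, SSL_verify_mode => 0x00 },
    );
    my $resp = $ua->get($url);

    # Check for the exact expected status: a 404 (missing route) and a
    # 502 (proxy in front of a stopped Apache) both fail here.
    is($resp->code, 200, 'offline interface returns 200, got ' . $resp->code);
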

Results for 2022-06-08

02:50 berick joined #evergreen
06:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-08_04:00:02/test.49.html>
07:25 rjackson_isl_hom joined #evergreen
07:49 collum joined #evergreen
08:05 RFrasur joined #evergreen
15:19 Dyrcona People do get uptight about the queue order, but it's just a guess at best.
15:19 mmorgan We're not using a soft retarget interval. Maybe that would help.
15:21 Dyrcona We've experimented with every half hour, every 15 minutes, etc. Then, it started running into itself and it ended up being 1/hour effectively. We skip the hours between midnight and 6:00 am to avoid the overnight jobs that run during those hours.
15:22 Dyrcona The soft retarget interval might help. John Amundson could probably tell you better why we use it because it was his testing that made us decide on that value. I forget exactly why we ended up with that value.
15:23 * mmorgan would think you could end up with a LOT more holds being retargeted using a soft interval.
15:25 Dyrcona Our "normal" retarget interval is 48 hours to keep things on pull lists a little bit longer.
15:25 Dyrcona We'd typically see the running into itself happen for a day or so after an upgrade, then it just kept up (probably because of the soft interval), so we changed the schedule.
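
For reference, the schedule Dyrcona describes maps naturally onto a cron.d-style entry like the sketch below. The script path, user, and option names are assumptions modeled on the Evergreen hold targeter support script and should be checked against your version; the 48-hour retarget interval is the value quoted above, while the soft interval value here is only a placeholder:

    # Target holds hourly, but skip midnight-06:00 to stay clear of the
    # overnight maintenance jobs (path, user, and flags are assumptions).
    0 6-23 * * *  opensrf  /openils/bin/hold_targeter.pl --retarget-interval "48h" --soft-retarget-interval "1h"
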
17:09 mmorgan left #evergreen
17:14 jeffdavis hm, that doesn't work for me unfortunately
17:17 jeffdavis we're using skins, I wonder if that is causing trouble
18:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-08_16:00:03/test.49.html>
22:45 jeff jeffdavis: do you see the text-white class in the output html and it isn't working, or does that class not make it to the html seen by the browser?

Results for 2022-06-07

06:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-07_04:00:02/test.49.html>
07:32 collum joined #evergreen
07:53 RFrasur joined #evergreen
08:08 Dyrcona joined #evergreen
14:49 csharp__ joined #evergreen
16:12 jvwoolf left #evergreen
17:02 mmorgan left #evergreen
18:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-07_16:00:03/test.49.html>

Results for 2022-06-06

06:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-06_04:00:02/test.49.html>
07:57 RFrasur joined #evergreen
08:11 collum joined #evergreen
08:24 mantis1 joined #evergreen
16:28 Dyrcona I didn't check the code, but #deleted seems to be looking at deleted copies or call numbers and not deleted bibs.
16:39 bgillap Ahh that would explain it actually if it's for bib records and not copies.
17:17 mmorgan left #evergreen
17:44 pinesol News from commits: LP#1930617: reduce parallel requests initiated by AngularJS holdings editor <https://git.evergreen-ils.org/?p=Evergreen.git;a=commitdiff;h=3141170291ccb38a84d3ad68371cd87bbf6cb4b9>
18:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-06_16:00:03/test.49.html>
19:53 DaMobi joined #evergreen
19:53 Bmagic joined #evergreen
19:53 dluch joined #evergreen

Results for 2022-06-05

06:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-05_04:00:02/test.49.html>
18:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-05_16:00:04/test.49.html>

Results for 2022-06-04

06:02 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-04_04:00:03/test.49.html>
08:03 JBoyer joined #evergreen
14:19 JBoyer joined #evergreen
18:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-04_16:00:02/test.49.html>

Results for 2022-06-03

06:01 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-03_04:00:04/test.49.html>
07:16 rjackson_isl_hom joined #evergreen
07:42 collum joined #evergreen
08:05 RFrasur joined #evergreen
12:34 * pinesol brews and pours a cup of Colombia Huila Supremo, and sends it sliding down the bar to Character Set
14:14 jeffdavis unapi.metabib_virtual_record_feed is incredibly slow in our production database: ~30 seconds to return results for 10 records, ~2s for a single record. In a Concerto db I get results for 10 records in less than 1s.
14:27 Dyrcona Probably needs an index on something.
15:22 jeffdavis No, it's weirder than that I think.
15:22 jeffdavis It's actually the call to unapi.mmr_holdings_xml from within unapi.metabib_virtual_record_feed that's slow.
15:23 jeffdavis unapi.mmr_holdings_xml takes ~23s for my test query ... but an SQL query that ought to be equivalent to that function takes <1s
15:25 Dyrcona You say the query is "equivalent." Is it identical to the one used in the function?
15:29 jeffdavis looks like our version of unapi.mmr_holdings_xml uses evergreen.array_remove_item_by_value in a few places instead of array_remove
15:30 jeffdavis I'll see if I can make things more exact
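
A timing harness for the comparison jeffdavis describes might look like the hedged sketch below; the DBI/Time::HiRes framing and connection details are assumptions, and the two statements to compare are left as clearly marked placeholders, since neither the full unapi.mmr_holdings_xml argument list nor the hand-written query appears in the log:

    use strict;
    use warnings;
    use DBI;
    use Time::HiRes qw(time);

    # Placeholder connection details for a local Evergreen database.
    my $dbh = DBI->connect('dbi:Pg:dbname=evergreen;host=localhost',
                           'evergreen', 'evergreen', { RaiseError => 1 });

    # Run a statement and report wall-clock time, so the function call
    # and the supposedly equivalent SQL can be compared side by side.
    sub time_query {
        my ($label, $sql, @binds) = @_;
        my $start = time();
        $dbh->selectall_arrayref($sql, undef, @binds);
        printf "%-20s %7.3fs\n", $label, time() - $start;
    }

    time_query('smoke test', q{SELECT pg_sleep(0.1)});  # runnable demo
    # Substitute the real statements being compared, e.g.:
    # time_query('function call',  q{SELECT unapi.mmr_holdings_xml(...)});
    # time_query('equivalent SQL', q{SELECT ...});

If the function itself is the bottleneck, PostgreSQL's auto_explain module with auto_explain.log_nested_statements enabled can log the plans of statements executed inside the function, which is one way to see where the time goes.
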
16:24 Dyrcona I think you alter the function to change the owner.
16:25 Dyrcona Yeah, ALTER FUNCTION name [ ( [ [ argmode ] [ argname ] argtype [, ...] ] ) ] OWNER TO { new_owner | CURRENT_USER | SESSION_USER }
16:27 jvwoolf left #evergreen
16:28 jeffdavis unfortunately it doesn't help - changed the owner to evergreen in a test env and the function is still slow
16:32 jeffdavis modifying it to use array_remove doesn't help either
16:36 Dyrcona That's puzzling. It seems like a crazy hit to take for calling a function.
16:40 jeffdavis yeah seems to be 2.5s per item in the array
16:43 Dyrcona jeffdavis: You could try asking in #postgresql. Somebody there might have some idea what's going on.
16:44 jeffdavis Good idea. Thanks for the help!
16:59 mmorgan left #evergreen
18:00 pinesol News from qatests: Failed Running perl live tests <http://testing.evergreen-ils.org/~live//archive/2022-06/2022-06-03_16:00:02/test.49.html>
