Discussion: IA Upload queue
Andrea Zanni
2017-09-12 08:36:41 UTC
Dear all,
could someone help me understand whether we have an issue here?
https://tools.wmflabs.org/ia-upload/commons/init

Some librarians uploaded books months ago,
but they were never processed.
Is the tool working, or does it simply never signal when it fails?
Ilario Valdelli
2017-09-12 08:47:37 UTC
[2017-08-02 08:00:49] LOG.CRITICAL: Client error: `POST https://commons.wikimedia.org/w/api.php` resulted in a `413 Request Entity Too Large` response: <html> <head><title>413 Request Entity Too Large</title></head> <body bgcolor="white"> <center><h1>413 Request Entity (truncated...) [] []

Either the program is not able to process huge files, or, quite simply, the disk space has run out.

Kind regards

Sam Wilson
2017-09-12 12:03:19 UTC
Yes, this happens when the resulting DjVu file is larger than Commons
will allow. I think 100 MB is the limit?
I'm not sure how to get around this. Perhaps we could resize the images
to be smaller? But we don't want to do that every time, so perhaps we have
to generate the DjVu, check how big it is, and, if it's too big, resize the
images and build it again. Would that work?
We could also make the oversize DjVu available for download, so that the
user could then upload it to Commons with a different method (is there such
a method?).
Suggestions welcome!
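For what it's worth, here's a rough sketch of that "build, measure, shrink,
rebuild" idea in Python. The build_djvu callable is hypothetical; it stands in
for whatever ia-upload actually uses to assemble the DjVu from the page images:

import os
from typing import Callable

COMMONS_LIMIT = 100 * 1024 * 1024  # the non-chunked upload ceiling, ~100 MB


def build_within_limit(build_djvu: Callable[[str, int], None],
                       output_path: str) -> str:
    """Build the DjVu; if it comes out too large, rebuild it from
    progressively downscaled page images until it fits (or give up)."""
    for scale in (100, 75, 50, 25):  # page-image scale, in percent
        build_djvu(output_path, scale)  # hypothetical builder, not a real ia-upload function
        if os.path.getsize(output_path) <= COMMONS_LIMIT:
            return output_path
    raise RuntimeError('DjVu exceeds the Commons size limit even at 25% scale')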

A related issue is https://phabricator.wikimedia.org/T161396
I can't find a ticket yet for the request-too-large problem, but I
remember seeing one; anyway, I'll create it again, and perhaps Community
Tech can look into it.
There's also the slight possibility that IA might start creating DjVus
again! Which would be brilliant, but I haven't heard anything about that
since Wikimania.
—Sam.
Yann Forget
2017-09-12 12:07:06 UTC
Hi,

The upload limit on Commons is 4 GB, so there shouldn't be any issue.
If the script can't upload files bigger than 100 MB, it should be fixed.
See how other tools handle this, for example video2commons.

Regards,

Yann
Nicolas VIGNERON
2017-09-12 12:17:22 UTC
100 MB is the « regular » limit.
4 GB is the limit for chunked uploads:
https://commons.wikimedia.org/wiki/Commons:Chunked_uploads (the file ends
up in the same place, but it's not the same mechanism).

Regards, ~nicolas
Sam Wilson
2017-09-12 12:24:46 UTC
Ah yes!

So I think we just need to make ia-upload better. :-)

https://www.mediawiki.org/wiki/API:Upload#Chunked_uploading
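
For reference, a minimal sketch of what chunked uploading looks like against
that API, in Python. It assumes an already-authenticated requests session and
a valid CSRF token; login and error handling are left out, and the edit
summary is just a placeholder:

import os
import requests

API = 'https://commons.wikimedia.org/w/api.php'
CHUNK_SIZE = 5 * 1024 * 1024  # 5 MiB per request, well under the POST limit


def upload_chunked(session: requests.Session, csrf_token: str,
                   path: str, filename: str, page_text: str) -> dict:
    """Stash a local file in chunks, then assemble it into a file page."""
    total = os.path.getsize(path)
    filekey = None
    offset = 0
    with open(path, 'rb') as f:
        while offset < total:
            chunk = f.read(CHUNK_SIZE)
            data = {
                'action': 'upload', 'format': 'json', 'stash': 1,
                'filename': filename, 'filesize': total,
                'offset': offset, 'token': csrf_token,
            }
            if filekey:
                data['filekey'] = filekey
            r = session.post(API, data=data, files={
                'chunk': (filename, chunk, 'application/octet-stream')})
            filekey = r.json()['upload']['filekey']
            offset += len(chunk)
    # Final request: turn the stashed chunks into the actual file page.
    r = session.post(API, data={
        'action': 'upload', 'format': 'json', 'filename': filename,
        'filekey': filekey, 'comment': 'Imported from the Internet Archive',
        'text': page_text, 'token': csrf_token,
    })
    return r.json()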
Sam Wilson
2017-09-12 12:23:09 UTC
Oh right! I'm behind the times for sure. :-)

I guess it's because we don't support multi-part (chunked) uploads, then?
Each chunk must be smaller than some limit?
New bug now at: https://phabricator.wikimedia.org/T175680
Federico Leva (Nemo)
2017-09-12 12:09:51 UTC
Post by Sam Wilson
I'm not sure how to get around this.
Probably by delegating the upload to a library which is able to use
chunked uploading:
https://www.mediawiki.org/wiki/API:Upload#Chunked_uploading
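
For instance, Pywikibot appears to support this via the chunk_size parameter
of Site.upload; a rough sketch (the file title and local path below are made
up, and the exact signature should be checked against the installed version):

import pywikibot

site = pywikibot.Site('commons', 'commons')
page = pywikibot.FilePage(site, 'File:Example IA scan.djvu')  # hypothetical title
site.upload(
    page,
    source_filename='/tmp/example-ia-scan.djvu',  # hypothetical local path
    comment='Uploaded with ia-upload (chunked)',
    text='Imported from the Internet Archive.',
    chunk_size=5 * 1024 * 1024,  # upload in 5 MiB chunks instead of one big POST
)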

Nemo