Problem with consecutive operations

Jan 30, 2008 at 6:00 PM
Hi all,
I am writing an application that needs to continuously upload files to a bucket.
The problem is that after some uploads (or any other operations on the bucket), I can't perform any further operations on that bucket; they all time out.
I found out about ConnectionLimit, but I do not need or perform any concurrent operations; everything is done serially.
Is there something that I am missing?
Thanks in advance!
Jan 31, 2008 at 7:42 PM
I solved my problem!
I was checking for a temporary redirect (for EU buckets) with a BucketListRequest and the HEAD method, as in FormSample::UploadFile.
I am only guessing that the HEAD method is causing the problem, as I am not an HTTP expert.
After I removed this check, everything works fine.
I can live with that, since a temporary redirect exists only for a few minutes after bucket creation.
I suppose I could use the GET method instead when checking for temporary redirects, but a better solution would be desirable.
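For reference, the check I had looks roughly like this; it is a sketch against plain HttpWebRequest rather than the library's wrappers, and bucketName is just whatever bucket you are uploading to:

    // A sketch, not the library's actual code.
    static bool IsTemporaryRedirect(string bucketName)
    {
        HttpWebRequest check = (HttpWebRequest)WebRequest.Create(
            "http://" + bucketName + ".s3.amazonaws.com/");
        check.Method = "HEAD";
        check.AllowAutoRedirect = false;  // surface the 307 instead of following it
        using (HttpWebResponse response = (HttpWebResponse)check.GetResponse())
        {
            // 307 means the EU bucket is still propagating after creation
            return (int)response.StatusCode == 307;
        }
    }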
Mar 11, 2008 at 11:01 PM
I have experienced a similar issue and I do not think the problem is with the HEAD method.

My guess is that the HttpWebResponse in Query.ThreeSharpQuery.Invoke<T> is never closed, so the connection is not released immediately, and at some point you run out of connections. Closing the DataStream as the documentation suggests does not actually release the connection held by the HttpWebResponse unless KeepAlive = false.

IMHO, the library needs a fix: either close the HttpWebResponse properly, or set KeepAlive = false on the HttpWebRequest.
See http://support.microsoft.com/kb/915599 for more information.
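To make it concrete, here is the shape of the fix I have in mind; this is a sketch against plain HttpWebRequest, not the library's actual Invoke<T> code, and url is a placeholder:

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    // Option 1: set this before calling GetResponse() to close the
    // connection after each request instead of returning it to the pool:
    // request.KeepAlive = false;

    // Option 2: dispose the response, which releases the pooled
    // connection even with KeepAlive left at true:
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (Stream body = response.GetResponseStream())
    {
        // ... read body here; both are closed when the block exits ...
    }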

What is your opinion?
Mar 17, 2008 at 12:36 PM
I tried the fix you are suggesting (httpWebRequest.KeepAlive = false; in ConfigureWebRequest) back when I was debugging this, with no luck.
I do not remember whether I was still seeing the same behaviour or whether something else stopped working after the fix.
I will look into this in more detail when I find some time.
I agree that the HEAD method should not be the problem!
Coordinator
Mar 18, 2008 at 5:10 PM
Hi,
Can you give some more details? What size are the files? How many uploads before you hit problems? Are you doing the temporary redirect check before every upload, or only before the first one?

Thanks,
Joel Wetzel
Affirma Consulting
Mar 18, 2008 at 10:33 PM
Edited Mar 18, 2008 at 10:34 PM
Hi,

The tests I have run so far use small files: 2 KB to 10 KB.
I was doing the "temporary redirect check" before every upload, and the 9th or 10th file (I do not remember which) would consistently fail.
I am now doing the "temporary redirect check" only once, before the first upload in a series, and I no longer hit the connection limit in my unit tests. I would have to run more tests before drawing any conclusions, though.
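In essence, the change was just this; CheckTemporaryRedirect and Upload stand in for my actual helpers:

    bool redirectChecked = false;
    foreach (string file in files)
    {
        if (!redirectChecked)
        {
            CheckTemporaryRedirect(bucketName);  // the HEAD check, once only
            redirectChecked = true;
        }
        Upload(bucketName, file);
    }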

Regards,
Jacques
Mar 18, 2008 at 11:26 PM
Edited Mar 18, 2008 at 11:27 PM
Hi all,

I have been dealing with this issue as well.

I seem to hit this issue when I am getting the metadata using HEAD. I created an ObjectHeadRequest.cs which contains:
public ObjectGetHeadRequest(String bucketName, String key)
{
    // issue a HEAD so only the object's metadata comes back
    this.Method = "HEAD";
    this.BucketName = bucketName;
    this.Key = key;
}
I have set up a loop to send the same 32 KB file 100 times to the S3 server. If I do not call the above, I get all 100 transfers; if I do call it, I only get 9 or 10. It is very simple to reproduce. Since this response is just derived from Response, the metadata is returned successfully, so this points at the HttpWebRequest not closing successfully on metadata requests. I have made sure all responses call Close on the DataStream.
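The repro loop is nothing more than this; UploadFile and GetMetadata stand in for my wrappers around the library's requests:

    for (int i = 0; i < 100; i++)
    {
        UploadFile(bucketName, key, data);  // same 32 KB file each time
        GetMetadata(bucketName, key);       // the HEAD request shown above
        Console.WriteLine("transfer " + (i + 1));  // stalls after 9 or 10
    }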

Thanks


Neil
Coordinator
Mar 19, 2008 at 4:38 AM
Ah, that might be it. If you don't close all DataStreams, there will DEFINITELY be problems. The next version will implement IDisposable, so you'll be able to use the "using" statement. That will ensure all DataStreams get closed, no matter what.
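Once the responses implement IDisposable, the calling pattern should look something like this; the response type and method names here are illustrative, and the final API may differ:

    using (ObjectGetResponse response = service.ObjectGet(request))
    {
        // work with response.DataStream here; Dispose() closes the
        // stream and releases the connection even if an exception is thrown
    }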

Thanks,
Joel Wetzel
Affirma Consulting