Thursday, July 8, 2010

Azure big upload take 2

Since posting the last big upload post we realised there is actually a 100MB upload limit in Azure. This isn't 100MB per user; it's 100MB per server. So if two users each upload 55MB, most likely both uploads will fail. I think this is because the location ASP.NET uses for temporary upload files only has 100MB available. Sure, you can configure another location in web.config, but that isn't really possible when you can only get a writable location at runtime. We emailed Azure support, and after about a week this is the only response so far:
"...So far the problem you have reported is a known bug and there is no fix for the issue besides the following workarounds:

  1. Use 3rd party ASP component which does not use asp.net temp folder

  2. Use silverlight

  3. Create your own asp.net upload component which does not use asp.net temp folder..."



So... where to now!? I did a bit of research and came across a great post that shows how to bypass ASP.NET's upload handling - http://dimebrain.com/2010/01/large-or-asynchronous-file-uploads-in-asp-net-mvc.html. The only problem with that solution is that it doesn't appear to handle form fields, which we also need. When I tried it, it threw an exception fairly early on, and because it didn't quite cover what we needed I wrote a version that handles any number of files, a file size limit (although this only applies to the size of the stream itself) and any number of form fields.
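To show the core idea the dimebrain post relies on, here's a minimal sketch of draining a request body through HttpWorkerRequest instead of letting ASP.NET buffer it. The class and method names here are my own illustrative assumptions, not the post's code and not my BigUploadProcessor; the HttpWorkerRequest calls themselves are the real ASP.NET API. ASP.NET preloads the first chunk of the entity body, and you read the rest off the wire yourself, writing straight to disk so the ASP.NET temp folder is never involved:

```csharp
// Sketch only: reading a request body via HttpWorkerRequest so ASP.NET
// never buffers the upload to its temp folder. Names and buffer size are
// illustrative assumptions.
using System.IO;
using System.Web;

public static class RawRequestReader
{
    public static void CopyRequestBodyToFile(HttpWorkerRequest workerRequest, string destinationPath)
    {
        using (var output = File.Create(destinationPath))
        {
            // ASP.NET preloads the first chunk of the entity body
            var preloaded = workerRequest.GetPreloadedEntityBody();
            if (preloaded != null)
                output.Write(preloaded, 0, preloaded.Length);

            // the rest is still on the wire; pull it in small chunks so the
            // whole upload never sits in memory or in the ASP.NET temp folder
            if (!workerRequest.IsEntireEntityBodyIsPreloaded())
            {
                var buffer = new byte[64 * 1024];
                int bytesRead;
                while ((bytesRead = workerRequest.ReadEntityBody(buffer, buffer.Length)) > 0)
                    output.Write(buffer, 0, bytesRead);
            }
        }
    }
}
```

Because the bytes land on disk as they arrive, this works with the Azure local resource approach below: point `destinationPath` at the writable local storage you get at runtime.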

I also tried to implement this as an ActionFilter, e.g. ...
[BigUpload(100)]
public ActionResult Upload()

... but it appeared that using an ActionFilter forced ASP.NET to start handling the stream itself, which is exactly what we didn't want. I'm not 100% sure whether it was the ActionFilter itself that caused this, or some call inside the ActionFilter that made ASP.NET deal with the stream.

So the solution. First, the usage:

public ActionResult Upload()
{
    // I have a 20GB local resource configured for this web role called "UploadStorage"
    var temporaryStorageLocation = RoleEnvironment.GetLocalResource("UploadStorage").RootPath;

    // work with the underlying connection directly
    var context = ControllerContext.HttpContext;
    var provider = (IServiceProvider)context;
    var workerRequest = (HttpWorkerRequest)provider.GetService(typeof(HttpWorkerRequest));

    var processor = new BigUploadProcessor(workerRequest, temporaryStorageLocation);
    processor.UploadLength = 100 * 1024 * 1024; // 100MB
    processor.HandleRequest(context, context.Request.ContentEncoding);

    // at this point you can use processor.Fields["fieldName"] or processor.Files["fieldName"]
    // ... then build and return whatever result you need
}

Then you just need the guts: BigUpload. I've tested it in Firefox, Chrome and Internet Explorer without any issues, but please comment if you try it and run into any problems.
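For anyone curious what the guts have to do first: the browser sends the upload as multipart/form-data, and the boundary that separates the fields and files comes from the Content-Type header. Here's a hedged sketch of extracting it; the class and method names are mine for illustration and are not taken from the BigUpload source:

```csharp
// Sketch: pulling the multipart boundary out of a Content-Type header.
// A raw-stream upload parser needs this to split fields and files apart.
using System;

public static class MultipartBoundary
{
    public static string Extract(string contentType)
    {
        // e.g. "multipart/form-data; boundary=----WebKitFormBoundaryabc123"
        const string marker = "boundary=";
        var index = contentType.IndexOf(marker, StringComparison.OrdinalIgnoreCase);
        if (index < 0)
            throw new ArgumentException("Not a multipart request", "contentType");

        var boundary = contentType.Substring(index + marker.Length).Trim().Trim('"');
        // within the body, each part is delimited by "--" followed by the boundary
        return "--" + boundary;
    }
}
```

With the boundary in hand, the parser scans the raw stream for the delimiter, reads each part's headers (Content-Disposition tells you whether it's a field or a file and gives you the name), and writes file parts out to disk as it goes.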

This is provided for information only, use at your own risk.
