I have discovered a large memory leak when using XFile. As per the documentation, the page that hosts XFile calls a progress page, which starts the upload to the client. The callback function below performs this. The opener page, which uses an iterator to process the upload, has garbage cleanup that does not release memory. Over time, during large uploads, the web server exhausts its memory and has to reset. If the session dies, the memory is released.

<SCRIPT language="VBS">
Sub Callback()
    AXFFileProgress.XFRequestStream = opener.Form.all("AXFFileDownload").XFRequestStream
    AXFFileProgress.ShowProgress
    'Response.AddHeader "Content-Length", opener.document.all("AXFFileDownload").file.Size
    opener.Form.all("AXFFileDownload").Start
    opener.Form.all("AXFFileDownload").reset
    window.close
End Sub
</SCRIPT>

I will spare you the entire iteration loop and only include the definition of the objects at the beginning and the disposal at the end:

Dim oFile As SaFile
Dim oFileUp As New FileUp(Me.Context)
Dim iterator As SaUploadDictionaryEnum
iterator = CType(oFileUp.Form.GetEnumerator(), SaUploadDictionaryEnum)

'destroy destroy destroy
oFileUp.Dispose()
oFileUp = Nothing
oFile = Nothing
iterator = Nothing

The funny thing is that as soon as Start is called and the response is written to the opener, memory use grows even if I comment out the entire iterator function. How do I get this page to release the memory?
XFile runs entirely on the client, so it would not be able to affect memory on the server. FileUp runs on the server, so that is what we should look at. There were some small memory leaks discovered in old versions of FileUp, but they were all fixed, and even those could not have caused a significant problem within a single upload.

FileUp doesn't use much memory at all during an upload. The HttpModule intercepts the request as it comes in and caches the file(s) to disk. FileUp's object model gets populated with metadata about the file but doesn't store the file itself in memory.

I suspect that perhaps you don't have the HttpModule configured properly, in which case .NET will read in the entire request before FileUp can access it. Older versions of .NET (1.0 and 1.1) used to cache the entire request in memory, which could cause behavior like what you are seeing. .NET 2 and above are more memory efficient, but they still read in the entire request before your page-level code even executes. Regardless of which version of .NET you are using, our HttpModule will completely bypass .NET's caching of the request, so using the HttpModule is best for performance.
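For reference, here is a minimal sketch of what the HttpModule registration typically looks like in web.config. The type and assembly names shown are assumptions for illustration, so verify them against the documentation for the FileUp version you have installed:

```xml
<!-- Sketch only: the module type/assembly names below are assumptions;
     check them against your FileUp install documentation. -->
<configuration>
  <!-- IIS 6 and IIS 7 Classic-mode pipeline -->
  <system.web>
    <httpModules>
      <add name="FileUpModule"
           type="SoftArtisans.Net.FileUpModule, SoftArtisans.Net.FileUpModule" />
    </httpModules>
  </system.web>
  <!-- IIS 7+ Integrated-mode pipeline -->
  <system.webServer>
    <modules>
      <add name="FileUpModule"
           type="SoftArtisans.Net.FileUpModule, SoftArtisans.Net.FileUpModule" />
    </modules>
  </system.webServer>
</configuration>
```

If the module is registered in the wrong section for your application pool's pipeline mode, it silently never runs, and ASP.NET buffers the whole request itself, which matches the memory growth you describe.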
To help troubleshoot the problem, can you please answer these questions:
Version of SAFileUp is 18.104.22.168. OS is Server 2003 R2. .NET is 2.0.50727.
FileUp is being used in a control (.ascx) within the content management package DotNetNuke. If I follow all the instructions for setting up the handler for .uplx, I get a "failed to execute url" error.
<add name="uplxForSs" path="*.uplx" verb="*" modules="IsapiModule" scriptProcessor="C:\Windows\Microsoft.NET\Framework\v2.0.50727\aspnet_isapi.dll" resourceType="Unspecified" preCondition="classicMode,runtimeVersionv2.0,bitness32" />
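In addition to the IIS handler mapping above, ASP.NET itself needs to be told how to compile and serve .uplx pages. A sketch of the system.web entries usually paired with that mapping is below; these use standard .NET framework types, but verify the exact entries against the FileUp setup instructions for your version:

```xml
<system.web>
  <compilation>
    <buildProviders>
      <!-- Compile .uplx files the same way .aspx pages are compiled -->
      <add extension=".uplx" type="System.Web.Compilation.PageBuildProvider" />
    </buildProviders>
  </compilation>
  <httpHandlers>
    <!-- Serve *.uplx requests with the standard ASP.NET page handler -->
    <add path="*.uplx" verb="*" type="System.Web.UI.PageHandlerFactory" validate="true" />
  </httpHandlers>
</system.web>
```

A "failed to execute url" error in Classic mode often means the IIS script map and the ASP.NET-side mappings don't agree, so it is worth confirming both halves are present.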
Dev box is windows 7.
Are the version numbers and paths you are using in your web.config correct for your environment?
Is the user control being used in a .uplx page? Even after you get the web.config and IIS settings correct, you will still need the .uplx extension. I'm not sure how easy that is to do in the CMS, but I believe we have had customers use FileUp with DotNetNuke in the past. Here are a couple of old threads about FileUp and DotNetNuke, although neither reached a clear resolution:
As explained in one of the threads, FileUpEE has a customizable extension, but FileUp SE and PE require the .uplx extension.