Repro sample: FileUploadSample (file://KK-MC2/FileUploadSample)
I have a simple custom buffer selector where UseBufferedInputStream returns true.
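The selector is roughly along these lines (a minimal sketch assuming Web API's IHostBufferPolicySelector; the class name is made up and the selector in the repro sample may differ):

using System.Net.Http;
using System.Web.Http.Hosting;

// Forces Web API to buffer the incoming request body in memory.
public class BufferAllInputPolicySelector : IHostBufferPolicySelector
{
    // Always buffer the request body, even for very large uploads.
    public bool UseBufferedInputStream(object hostContext)
    {
        return true;
    }

    // Leave response buffering on as well.
    public bool UseBufferedOutputStream(HttpResponseMessage response)
    {
        return true;
    }
}

It is registered by replacing the default service, e.g. config.Services.Replace(typeof(IHostBufferPolicySelector), new BufferAllInputPolicySelector());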
When I try to upload a 2GB file, I see a failure with the stack trace below.
{"The stream does not support concurrent IO read or write operations."}
at System.Net.ConnectStream.InternalWrite(Boolean async, Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
at System.Net.ConnectStream.BeginWrite(Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
at System.Net.Http.StreamToStreamCopy.TryStartWriteSync(Int32 bytesRead)
at System.Net.Http.StreamToStreamCopy.BufferReadCallback(IAsyncResult ar)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at FileUploadSample.Program.<RunClient>d__0.MoveNext() in e:\TestProjects\FileUploadSample\Program.cs:line 72
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.AsyncMethodBuilderCore.<ThrowAsync>b__1(Object state)
at System.Threading.QueueUserWorkItemCallback.WaitCallback_Context(Object state)
at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem()
at System.Threading.ThreadPoolWorkQueue.Dispatch()
at System.Threading._ThreadPoolWaitCallback.PerformWaitCallback()
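For reference, the client side of the repro looks roughly like this (a minimal sketch, not the actual FileUploadSample code; the endpoint URL, file path, and use of StreamContent are assumptions):

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        RunClient().GetAwaiter().GetResult();
    }

    static async Task RunClient()
    {
        using (var client = new HttpClient())
        using (var file = File.OpenRead(@"e:\temp\2gb.bin")) // hypothetical 2GB payload
        {
            var content = new StreamContent(file);
            // The upload streams the file to the server; the exception above surfaces
            // when the response task is awaited.
            HttpResponseMessage response = await client.PostAsync("http://localhost:9095/api/upload", content); // hypothetical endpoint
            Console.WriteLine(response.StatusCode);
        }
    }
}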
Comments: A few notes here:
1) This isn't a valid user scenario; no one should be buffering a 2GB request body.
2) The exception above is a client-side stack trace, not a server-side one. I'm unable to repro the client issue, but it would be a problem in HttpClient, not in Web API or OWIN.
3) I am noticing odd behavior, though: even when an OutOfMemoryException is thrown, the OWIN HttpListener server fully drains the request body before sending back a 500. This could expose users to a potential denial-of-service attack. I'll follow up with the Katana team.