Trying to get to the bottom of an OutOfMemoryException, I found that .NET's BufferManagers, used by WCF's buffered TransferMode, were responsible for wasting hundreds of megabytes (see the question and my own answer on "How can I prevent BufferManager / PooledBufferManager in my WCF client app from wasting memory?" for details, and for how I fixed it by simply switching the TransferMode from Buffered to Streamed).
Leaving aside WCF, BufferManagers were invented as a better-performing alternative to what you would normally do: simply allocating byte arrays when you need them and relying on the GC to clean them up and recycle the memory once the references go out of scope.
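To make the comparison concrete, here is a minimal sketch of both approaches using the `System.ServiceModel.Channels.BufferManager` API; the pool sizes and buffer sizes are arbitrary values chosen for illustration:

```csharp
using System;
using System.ServiceModel.Channels;

class BufferPoolingSketch
{
    static void Main()
    {
        // Pooled approach: keep up to 1 MB of buffers, each at most 64 KB.
        BufferManager manager = BufferManager.CreateBufferManager(
            maxBufferPoolSize: 1024 * 1024, maxBufferSize: 64 * 1024);

        // TakeBuffer may return a recycled array that is *larger* than requested.
        byte[] pooled = manager.TakeBuffer(16 * 1024);
        // ... use the buffer ...
        manager.ReturnBuffer(pooled);   // hand it back to the pool for reuse

        // Release everything the pool holds; without this, the pooled arrays
        // stay referenced by the manager and cannot be garbage-collected.
        manager.Clear();

        // The "normal" alternative: allocate and let the GC recycle it
        // once the reference goes out of scope.
        byte[] plain = new byte[16 * 1024];
        // ... use the buffer; no explicit cleanup needed ...
    }
}
```

Note that because `TakeBuffer` can return an oversized array, callers have to track the logical length separately, which is part of the inconvenience the question is asking about.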
So my question is: has anyone used BufferManagers in a real-world app where they made a noticeable enough difference in performance to justify the inconvenience of having to manually call .Clear() on the BufferManager (where that was necessary)?
And if so, could manually creating a single byte buffer and keeping a reference to it not have solved that particular problem just as well?