


Hi, I have a ConcurrentQueue that is loaded with files from a database. These files are to be processed by parallel Tasks that will dequeue the files. However, I run into issues where, after some time, I start getting tasks that dequeue the same file at the same time (which leads to "used by another process" errors on the file). And I also get more Tasks than are supposed to be allocated. I have even seen 8 tasks running at once, which should not be happening. Rough code:

    private void ParseQueuedTDXFiles()
    {
        while (_signalParseQueuedFilesEvent.WaitOne())
        {
            ...
        }
    }

The _signalParseQueuedFilesEvent is set on a timer in a Windows Service.
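For context, a minimal sketch of the timer-plus-AutoResetEvent pattern described above; the class name, worker thread, and 30-second interval are placeholder assumptions, not the actual service code:

    using System;
    using System.Threading;

    // Sketch: a timer periodically signals an AutoResetEvent, and a
    // dedicated worker loop wakes up once per signal.
    class QueuedFileWorker
    {
        private readonly AutoResetEvent _signalParseQueuedFilesEvent = new AutoResetEvent(false);
        private Timer _timer;

        public void Start()
        {
            // The 30-second interval is an assumption for illustration only.
            _timer = new Timer(_ => _signalParseQueuedFilesEvent.Set(), null,
                               TimeSpan.Zero, TimeSpan.FromSeconds(30));

            // WaitOne blocks until the timer sets the event; the event then
            // auto-resets, so the next iteration waits for the next signal.
            var worker = new Thread(() =>
            {
                while (_signalParseQueuedFilesEvent.WaitOne())
                {
                    ParseQueuedTDXFiles(); // placeholder for the real parsing pass
                }
            }) { IsBackground = true };
            worker.Start();
        }

        private void ParseQueuedTDXFiles() { /* dequeue and process files */ }
    }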
The above function then calls SetParsersTask. This is why I use a ConcurrentDictionary to track how many active tasks there are, and make sure they are below _ActiveTasksLimit:

    private void SetParsersTask()
    {
        if (_ConcurrentqueuedTdxFilesToParse.Count > 0)
        {
            ...
        }
    }

Which then calls this function, which dequeues the ConcurrentQueue:

    private void PrepTdxParser()
    {
        _ConcurrentqueuedTdxFilesToParse.TryDequeue(out fileToProcess);
        if (!string.IsNullOrEmpty(fileToProcess.TdxFileName))
        {
            ...
        }
    }

I even put a lock on _ConcurrentqueuedTdxFilesToParse even though I know it doesn't need one. All to make sure that I never run into a situation where two Tasks are dequeuing the same file.
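As a side note, here is a minimal sketch (stand-in record type and names, not the original code) of relying on the return value of TryDequeue alone: a ConcurrentQueue hands each item to exactly one successful TryDequeue call, so the dequeue itself needs neither an external lock nor a Count check.

    using System.Collections.Concurrent;

    // Stand-in for the real file record used in the post.
    record TdxFileToProcessData(string TdxFileName);

    class DequeueSketch
    {
        private readonly ConcurrentQueue<TdxFileToProcessData> _queue = new();

        public void DrainOnce()
        {
            // TryDequeue is atomic: when it returns true, this thread is the
            // only one that received that particular item.
            while (_queue.TryDequeue(out TdxFileToProcessData fileToProcess))
            {
                if (!string.IsNullOrEmpty(fileToProcess.TdxFileName))
                {
                    // hand the file off for parsing
                }
            }
        }
    }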

This function is where I add and remove Tasks, as well as launch the file parser for the dequeued file:

    private void LaunchTDXParser(TdxFileToProcessData fileToProcess)
    {
        string fileName = fileToProcess.TdxFileName;
        Task startParserTask = new Task(() => ConfigureAndStartProcess(fileName));
        _activeParserTasksDict.TryAdd(fileName, startParserTask);
        ...
        _activeParserTasksDict.TryRemove(fileName, out Task taskToBeRemoved);
    }

Can you guys help me understand why I am getting the same file dequeued in two different Tasks? And why I am getting more Tasks than the _ActiveTasksLimit?
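For illustration, a hedged sketch of the add-then-remove bookkeeping just described, with hypothetical names and an assumed limit rather than the original LaunchTDXParser. Note that checking Count and starting the task are two separate steps, which is one way the number of running tasks can drift past the intended limit when several threads pass the check at once:

    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    class ParserLauncherSketch
    {
        private readonly ConcurrentDictionary<string, Task> _activeParserTasksDict = new();
        private const int _ActiveTasksLimit = 4; // assumed value for illustration

        public void TryLaunch(string fileName)
        {
            // Check-then-act: another thread can pass this check at the same
            // time, so Count alone does not strictly enforce the limit.
            if (_activeParserTasksDict.Count >= _ActiveTasksLimit)
                return;

            Task parserTask = Task.Run(() => ConfigureAndStartProcess(fileName));
            _activeParserTasksDict.TryAdd(fileName, parserTask);

            // Remove the bookkeeping entry when the task completes,
            // whether it succeeded or faulted.
            parserTask.ContinueWith(_ => _activeParserTasksDict.TryRemove(fileName, out _));
        }

        private void ConfigureAndStartProcess(string fileName) { /* parse the file */ }
    }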
There are a number of red flags in this¹ code:

- I've never seen a problem solved with WaitHandles that couldn't be solved in a simpler way without them.
- Launching Task.Run tasks in a fire-and-forget fashion.
- Launching a Parallel.For loop without configuring the MaxDegreeOfParallelism. This practically guarantees that the ThreadPool will get saturated.
- Protecting a queue (_queuedTdxFilesToParse) with a lock (_concurrentQueueLock) only partially. If the queue is a Queue, you must protect it on each and every operation, otherwise the behavior of the program is undefined. If the queue is a ConcurrentQueue, there is no need to protect it because it is thread-safe by itself.
- Calling Task.Start without configuring the scheduler argument.

So I am not surprised that your code is not working as expected. I can't point to a specific error that needs to be fixed. For me the whole approach is dubious and needs to be reworked/scrapped. Some concepts and tools that you might want to research before attempting to rewrite this code: two powerful asynchronous tools are the Channel class and the Parallel.ForEachAsync API (available from .NET 6). Optionally, you could consider familiarizing yourself with asynchronous programming; it can help at reducing the number of threads that your program uses while running, resulting in a more efficient and scalable program.

¹ This answer was intended for a related question that is now deleted.
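To illustrate the direction suggested in this answer, here is a minimal sketch combining a Channel with Parallel.ForEachAsync (requires .NET 6 or later); the class name, the sample file names, the degree of parallelism of 4, and ParseFileAsync are illustrative assumptions, not code from the post:

    using System;
    using System.Threading;
    using System.Threading.Channels;
    using System.Threading.Tasks;

    // Producer/consumer sketch: a Channel carries file names from a producer
    // to a consumer, and Parallel.ForEachAsync processes them with an
    // explicit, bounded degree of parallelism.
    class ChannelPipelineSketch
    {
        public static async Task RunAsync()
        {
            Channel<string> channel = Channel.CreateUnbounded<string>();

            // Producer: enqueue file names (e.g. loaded from a database) and
            // complete the channel when there is nothing more to add.
            Task producer = Task.Run(async () =>
            {
                foreach (string file in new[] { "a.tdx", "b.tdx", "c.tdx" })
                    await channel.Writer.WriteAsync(file);
                channel.Writer.Complete();
            });

            // Consumer: at most 4 files are parsed concurrently.
            var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
            await Parallel.ForEachAsync(channel.Reader.ReadAllAsync(), options,
                async (file, cancellationToken) =>
                {
                    await ParseFileAsync(file, cancellationToken);
                });

            await producer;
        }

        private static Task ParseFileAsync(string file, CancellationToken cancellationToken)
            => Task.CompletedTask; // placeholder for the real parsing work
    }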

The solution was first to not add more parallelism than need be. I was trying to create a situation where private void SetParsersTask() would not be held up by tasks that still needed to finish processing a file. So I foolishly threw in Parallel.For in addition to Task.Start, which is already parallel. I fixed this by generating fire-and-forget Tasks in a normal for loop as opposed to Parallel.For:

    private void SetParsersTask()
    {
        if (_activeParserTasksDict.Count ...)
        {
            ... _queuedTdxFilesToParse.Distinct() ...
        }
    }

After that I was still getting the occasional duplicate files, so I moved the queue loading to another long-running thread:

    var _loadQueueTask = Task.Factory.StartNew(() => LoadQueue(), TaskCreationOptions.LongRunning);

And for that thread I use an AutoResetEvent so that the queue is only populated once at any instant in time, as opposed to potentially another task loading it with duplicate files. It could be that both my enqueue and dequeue were responsible, and now it's addressed:

    while (_loadConcurrentQueueEvent.WaitOne())
    {
        if (_queuedTdxFilesToParse.Count < _tdxParsersInstanceCount)
        {
            int numFilesToGet = _tdxParsersInstanceCount - _activeParserTasksDict.Count;
            var filesToAdd = ServiceDBHelper.GetTdxFilesToEnqueueForProcessingFromDB(numFilesToGet);
            ...
        }
    }
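A minimal, self-contained sketch of the fix described in this answer, with the database access stubbed out and the class name, limit, and helper method assumed; Task.Factory.StartNew with TaskCreationOptions.LongRunning is the usual way to request a dedicated long-running thread:

    using System;
    using System.Collections.Concurrent;
    using System.Linq;
    using System.Threading;
    using System.Threading.Tasks;

    // Sketch: a single long-running task owns the queue loading, and an
    // AutoResetEvent ensures only one load pass runs per signal, so no two
    // loaders can enqueue the same batch of files.
    class QueueLoaderSketch
    {
        private readonly ConcurrentQueue<string> _queuedTdxFilesToParse = new();
        private readonly ConcurrentDictionary<string, Task> _activeParserTasksDict = new();
        private readonly AutoResetEvent _loadConcurrentQueueEvent = new(false);
        private readonly int _tdxParsersInstanceCount = 4; // assumed value

        public void Start()
        {
            // TaskCreationOptions.LongRunning hints the scheduler to use a
            // dedicated thread instead of a ThreadPool thread.
            Task loadQueueTask = Task.Factory.StartNew(LoadQueue, TaskCreationOptions.LongRunning);
        }

        // Callers request a reload; the loader wakes up at most once per signal.
        public void RequestLoad() => _loadConcurrentQueueEvent.Set();

        private void LoadQueue()
        {
            while (_loadConcurrentQueueEvent.WaitOne())
            {
                if (_queuedTdxFilesToParse.Count < _tdxParsersInstanceCount)
                {
                    int numFilesToGet = _tdxParsersInstanceCount - _activeParserTasksDict.Count;

                    // Stand-in for ServiceDBHelper.GetTdxFilesToEnqueueForProcessingFromDB.
                    foreach (string file in GetFilesFromDb(numFilesToGet).Distinct())
                        _queuedTdxFilesToParse.Enqueue(file);
                }
            }
        }

        private static string[] GetFilesFromDb(int count) => Array.Empty<string>();
    }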
