Divide an IEnumerable into parts

By mortl92 on Jul 10, 2013

Use this if you have a collection of objects and you need to divide it into a given number of parts.

This is an example:

var testCollection = new List<string> { "1", "2", "3", "4", "5", "6" };
var parts = testCollection.Divide(3);

for (int i = 0; i < parts.Count(); i++)
    Console.WriteLine(string.Format("Part {0} -> {1}", i + 1, string.Join(" - ", parts.ElementAt(i).ToArray())));

Part 1 -> 1 - 2
Part 2 -> 3 - 4
Part 3 -> 5 - 6

public static class EnumerableExtensions
{
    public static IEnumerable<IEnumerable<T>> Divide<T>(this IEnumerable<T> items,
                                                        int partsNumber)
    {
        var parts = items.Partition(items.Count() / partsNumber).ToList();
        var count = parts.Count;
        if (count > partsNumber)
        {
            // If the item count is not an exact multiple of partsNumber,
            // Partition produces one extra part; merge it into the first one.
            parts[0] = parts[0].Concat(parts[count - 1]);
            parts.RemoveAt(count - 1);
        }
        return parts;
    }

    public static IEnumerable<IEnumerable<T>> Partition<T>(this IEnumerable<T> items,
                                                           int partitionSize)
    {
        int i = 0;
        return items.GroupBy(x => i++ / partitionSize).ToArray();
    }
}
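To illustrate what happens when the item count is not an exact multiple of partsNumber, here is a small worked example (it assumes the EnumerableExtensions class above is in scope):

```csharp
using System.Collections.Generic;
using System.Linq;

// 7 items split into 3 parts: 7 / 3 = 2, so Partition first yields
// {1,2} {3,4} {5,6} {7}; Divide then folds the surplus last part {7}
// into the first part.
var numbers = new List<string> { "1", "2", "3", "4", "5", "6", "7" };
var parts = numbers.Divide(3).ToList();
// parts: "1 2 7", "3 4", "5 6"
```

One caveat: calling Divide with a partsNumber larger than the item count makes the partition size 0, and the integer division inside GroupBy then throws a DivideByZeroException when the result is enumerated.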


Sorasyn   -  Jul 10, 2013

What would be some practical applications of splitting an IEnumerable object? For the most part they're just fancy arrays used by foreach loops and other iteration algorithms to check all supplied items.

mortl92  -  Jul 17, 2013

Sorry for the late response. You are right with your remark, but I still had a good reason to use this functionality :) My problem was this: I had a generic List containing thousands of entries. I worked through it in a foreach loop, but that was too slow for me, so I wanted to parallelize the loop.
First I converted the list to the thread-safe ConcurrentQueue and started 4 tasks, where each one took one file after the other. The next approach was to run 4 Tasks (by the way, I love the Task class and the TaskFactory from the .NET 4 framework), where each task got one part of the enumerable to work through. This was the much faster approach in my case (though it depends on the problem you have). Anyway, I was using the code above :)
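A minimal sketch of that second approach, assuming the Divide extension from the post is compiled alongside (the per-item work is left empty as a stand-in for the real processing):

```csharp
using System.Linq;
using System.Threading.Tasks;

var items = Enumerable.Range(1, 1000).ToList();   // stand-in for the real list
var parts = items.Divide(4).ToList();             // the extension from the post

// One Task per part; each task works through its own slice independently.
var tasks = parts
    .Select(part => Task.Factory.StartNew(() =>
    {
        foreach (var item in part)
        {
            // illustrative per-item work goes here
        }
    }))
    .ToArray();

Task.WaitAll(tasks);
```

Because each task owns a disjoint slice, no locking is needed between them, which is what makes this faster than four tasks pulling from one shared queue in this scenario.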

Sorasyn  -  Jul 17, 2013

Parallelizing, I like it. :) I've been playing around with some parallelization techniques for tables stored as text documents, in hopes of getting better performance all around. However, I'm getting somewhat bottlenecked loading the tables into memory. Anyway, nice job!

mortl92  -  Jul 17, 2013

If you are working in .NET I can only recommend Framework 4. There you can use a lot of powerful utilities for parallelization. I like the new Task class a lot; before, I was always using plain threads. Just a tip: if you have a slow foreach loop somewhere, have a look at the power of Parallel.ForEach (I'm currently using it a lot):
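For instance, a plain foreach over a list can be replaced like this (a minimal sketch; the word list and the work done per item are illustrative):

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

var words = new List<string> { "alpha", "beta", "gamma", "delta" };
var results = new ConcurrentBag<string>();

// Iterations may run concurrently on the thread pool, so results
// are collected in a thread-safe ConcurrentBag, not a plain List.
Parallel.ForEach(words, word => results.Add(word.ToUpper()));
```

Note that the iteration order is not deterministic, so the loop body must not depend on items being processed in sequence.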
You will find a lot of other helpful functions around Tasks :)
