Hi...
I have a table (fctTable) which references another table (refTable) that has a parent/child relationship to itself.
What I need to do is, for each row in fctTable, get the sum of a column in refTable over the whole chain of parents.
This is hard to explain, so here is an example...
Here are my two tables:
fctTable

Id | RefTableId |
---|---|
1 | 2 |
2 | 7 |
3 | 4 |
4 | 9 |
5 | 8 |

refTable

Id | ParentId | Value |
---|---|---|
1 | 0 | 17 |
2 | 1 | 11 |
3 | 0 | 23 |
4 | 5 | 5 |
5 | 3 | 7 |
7 | 2 | 13 |
8 | 0 | 3 |
9 | 8 | 19 |
and I need to get this out:

Result

fctTableId | (refTableIds) | SUM(refTable.Value) |
---|---|---|
1 | 2,1 | 11 + 17 = 28 |
2 | 7,2,1 | 13 + 11 + 17 = 41 |
3 | 4,5,3 | 5 + 7 + 23 = 35 |
4 | 9,8 | 19 + 3 = 22 |
5 | 8 | 3 = 3 |
I don't need the refTableIds column in the results; that's just to help me work out what the heck I'm doing...
I can do it by creating a table-valued function similar to the SP mentioned by Anish M
here, which gets the tree for a given refTableId, and then calling that from a select statement that chooses the rows in fctTable I want to use.
But that seems clunky and slow (I may have millions of records in fctTable to do this for, so slow == bad).
I was wondering if anyone can think of a better solution... any ideas?
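In case it helps, here is a runnable repro of the set-based approach I've been experimenting with (SQLite via Python just so it's self-contained; the fctTable column names `Id`/`RefTableId` are my guesses at the schema). It uses a recursive CTE to walk each fact row's parent chain and sum `Value` in a single query:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE refTable (Id INTEGER PRIMARY KEY, ParentId INTEGER, Value INTEGER);
    INSERT INTO refTable VALUES
        (1, 0, 17), (2, 1, 11), (3, 0, 23), (4, 5, 5),
        (5, 3,  7), (7, 2, 13), (8, 0,  3), (9, 8, 19);
    -- Assumed shape for fctTable: an Id plus the refTable row it points at
    CREATE TABLE fctTable (Id INTEGER PRIMARY KEY, RefTableId INTEGER);
    INSERT INTO fctTable VALUES (1, 2), (2, 7), (3, 4), (4, 9), (5, 8);
""")

rows = con.execute("""
    WITH RECURSIVE ancestors(FctId, RefId, ParentId, Value) AS (
        -- anchor: the refTable row each fact row references
        SELECT f.Id, r.Id, r.ParentId, r.Value
        FROM fctTable f
        JOIN refTable r ON r.Id = f.RefTableId
        UNION ALL
        -- recursive step: walk up to the parent; the recursion stops at
        -- ParentId = 0 because no refTable row has Id = 0
        SELECT a.FctId, r.Id, r.ParentId, r.Value
        FROM ancestors a
        JOIN refTable r ON r.Id = a.ParentId
    )
    SELECT FctId, SUM(Value)
    FROM ancestors
    GROUP BY FctId
    ORDER BY FctId
""").fetchall()

print(rows)  # [(1, 28), (2, 41), (3, 35), (4, 22), (5, 3)]
```

SQL Server supports the same `WITH ... AS` pattern (it just drops the `RECURSIVE` keyword), so if a recursive CTE really is the way to go, I assume it would translate directly; what I don't know is whether it beats the table-valued function at the millions-of-rows scale.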