Given the schema R = (A, B, C, D, E, H, I) with the functional dependency set F = {A→B, C→D, CD→E, BD→AH, H→D, AC→H}:
Is the decomposition of R into (A, B, C, D, I) and (B, C, E, H) dependency preserving?
I looked at examples with 4-5 attributes and tried to solve mine the same way, but it did not work. I don't know how to proceed, because the number of attributes and dependencies here is large.
How should I approach this question mathematically?
The answer is no, since the functional dependencies
H → D and BD → AH are not preserved. A way to check this is to project F onto the two schemas, obtaining F1 and F2, where each Fi contains every dependency in F+ whose attributes all lie in the corresponding schema (not just the dependencies listed in F; for example, C → E belongs to F2 because it follows from C → D and CD → E). The decomposition is dependency preserving if every dependency of F can be derived from F1 ∪ F2, that is, if F ⊆ (F1 ∪ F2)+. In this case the two dependencies above cannot be derived, so the decomposition is not dependency preserving.
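In practice you do not need to compute F+ or the projections explicitly: the standard textbook test grows the closure of each left-hand side while restricting the intermediate result to one fragment at a time, using only the original F for closures. Below is a minimal Python sketch of that check for this exact instance; the function and variable names (`closure`, `is_dependency_preserving`, `decomposition`) are my own choices for illustration.

```python
# Functional dependencies as (lhs, rhs) pairs of attribute sets.
F = [
    ({"A"}, {"B"}),
    ({"C"}, {"D"}),
    ({"C", "D"}, {"E"}),
    ({"B", "D"}, {"A", "H"}),
    ({"H"}, {"D"}),
    ({"A", "C"}, {"H"}),
]

# The two fragments of the decomposition.
decomposition = [{"A", "B", "C", "D", "I"}, {"B", "C", "E", "H"}]


def closure(attrs, fds):
    """Attribute closure of `attrs` under the FD set `fds`."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result


def is_dependency_preserving(fds, fragments):
    """For each X -> Y in F, grow X using only what is visible inside each
    fragment (closure under F, then intersected with the fragment).
    If Y is covered for every FD, the decomposition preserves F."""
    preserved = True
    for lhs, rhs in fds:
        result = set(lhs)
        changed = True
        while changed:
            changed = False
            for Ri in fragments:
                gained = closure(result & Ri, fds) & Ri
                if not gained <= result:
                    result |= gained
                    changed = True
        if not set(rhs) <= result:
            print("".join(sorted(lhs)), "->", "".join(sorted(rhs)), "is NOT preserved")
            preserved = False
    return preserved


print(is_dependency_preserving(F, decomposition))
```

The intersection with Ri at every step is what makes this a preservation test rather than a plain closure: it only keeps attributes that a single fragment could actually infer on its own. Running it reports that BD → AH and H → D fail and prints False, matching the conclusion above.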