How can I show that a language is context-sensitive using a nondeterministic Turing machine?
I know that a language accepted by a linear bounded automaton (LBA) is a context-sensitive language, and that an LBA is a kind of nondeterministic Turing machine.
Any idea how I can relate all of this and show that a language is context-sensitive?
As templatetypedef's answer has some flaws (which I will point out in a comment), here is a quick answer to your question:
A language L is context-sensitive if (and only if) you can give a nondeterministic Turing machine that decides L using only linear space, i.e. an LBA.
Let L = { a^n b^n a^n : n ≥ 1 }, where a^n denotes n concatenated copies of the symbol a. This is a typical context-sensitive language. Instead of giving a CSG, you can give an LBA to show that L is context-sensitive:
The Turing machine M 'guesses' n (thanks to nondeterminism; in other words, every branch of the nondeterministic computation tree tries out a different n) and then checks whether the input matches a^n b^n a^n. Storing n in binary takes about log n cells, and the matching (if implemented trivially) might need another log n cells. The input itself occupies 3n cells, and since 2 log n < 3n, the machine uses only linearly many cells in total. It is therefore an LBA, and hence L is context-sensitive.
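To make the guess-and-check idea concrete, here is a small Python sketch (my own illustration, not part of the original answer). A real nondeterministic machine would guess n in one step; a deterministic simulation simply tries every candidate n, one per branch of the search tree:

```python
def in_L(w: str) -> bool:
    """Decide membership in L = { a^n b^n a^n : n >= 1 }.

    Each value of n tried in the loop corresponds to one branch of
    the nondeterministic machine's computation tree. The only extra
    state is the counter n, which needs O(log n) bits, so the
    simulated machine stays within linear space of the input.
    """
    for n in range(1, len(w) + 1):  # one 'guess' of n per branch
        if w == "a" * n + "b" * n + "a" * n:
            return True  # some branch accepts
    return False  # no branch accepts


# Example: "aabbaa" is a^2 b^2 a^2 and belongs to L,
# while "aabba" does not.
```

Note that the simulation takes more *time* than the nondeterministic machine, but the space argument is the only thing that matters for showing L is accepted by an LBA.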