I have a table in Oracle 11g R2 that contains approximately 90,000 spatial (geographic) records. Hundreds of those records are duplicates, created through poor data-entry practices by users.
Is there any way to measure the performance of the database/table before and after removing the duplicates?
A table with 90,000 records is quite small, and hundreds of duplicates amount to less than 1% of the data, which is also a fairly small amount of "garbage". That volume can't cause significant performance problems (assuming your application has a good design). I don't think you can create tests that show any significant difference in performance between "before" and "after".
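If you still want to check for yourself, a minimal sketch of a before/after comparison in SQL*Plus might look like the following, using a representative query against a hypothetical table SPATIAL_DATA (the table name, column name, and filter range are placeholders for your own workload). Run the same script before and after the cleanup and compare the elapsed time and I/O statistics:

```sql
-- Show elapsed time and execution statistics for each statement.
SET TIMING ON
SET AUTOTRACE TRACEONLY STATISTICS

-- A representative query from your workload (hypothetical example).
SELECT COUNT(*)
FROM   spatial_data
WHERE  geom_id BETWEEN 1000 AND 2000;

SET AUTOTRACE OFF
SET TIMING OFF
```

With only a few hundred rows removed out of 90,000, expect the numbers to be essentially identical between the two runs.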
That said, you should delete the duplicates and then create a unique constraint to prevent this situation in the future.
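As a minimal sketch, assuming the duplicates are defined by a single key column GEOM_ID on a table SPATIAL_DATA (both hypothetical names; substitute your real table and the column, or columns, that define a duplicate):

```sql
-- Keep one row per GEOM_ID (the one with the smallest ROWID)
-- and delete every other copy.
DELETE FROM spatial_data
WHERE  rowid NOT IN (
         SELECT MIN(rowid)
         FROM   spatial_data
         GROUP  BY geom_id
       );

-- Prevent new duplicates from being inserted from now on.
ALTER TABLE spatial_data
  ADD CONSTRAINT spatial_data_uq UNIQUE (geom_id);
```

If a duplicate is defined by a combination of columns rather than a single one, list all of them in both the GROUP BY clause and the constraint definition.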