I have a table with millions of rows:
CREATE TABLE [dbo].[RequestIdentities]
(
[Id] [bigint] IDENTITY(1,1) NOT NULL,
[UniqueKey] [nvarchar](256) NULL,
[Timestamp] [datetime] NULL,
CONSTRAINT [PK_RequestIdentities]
PRIMARY KEY CLUSTERED ([Id] ASC)
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF,
IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON,
ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[RequestIdentities]
ADD CONSTRAINT [DF_RequestIdentities_Timestamp]
DEFAULT (GETDATE()) FOR [Timestamp]
GO
A Web API manipulates the database using ADO.NET and does the following:
Executes this query:
SELECT 1 FROM RequestIdentities WITH (NOLOCK) WHERE UniqueKey = @key
If the key exists:
if (reader.HasRows)
it returns an HTTP response.
Otherwise, it inserts the key into the table:
INSERT INTO RequestIdentities(UniqueKey) VALUES(@key)
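As a side note, the check and the insert above can be combined into a single round trip. A sketch, using the same table and parameter names as the question and assuming a duplicate key should simply be skipped:

```sql
-- Sketch: existence check and insert in one statement.
-- The UPDLOCK, SERIALIZABLE hints close the race window between
-- the check and the insert under concurrent callers.
INSERT INTO RequestIdentities (UniqueKey)
SELECT @key
WHERE NOT EXISTS (
    SELECT 1
    FROM RequestIdentities WITH (UPDLOCK, SERIALIZABLE)
    WHERE UniqueKey = @key
);
-- @@ROWCOUNT = 0 means the key already existed.
```

This avoids the window in the check-then-insert pattern where two concurrent requests both see "no row" and both insert the same key.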
There are hundreds of inserts/updates every minute. Is there anything I can do to optimize the table, such as custom statistics or indexes?
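The lookup on UniqueKey currently has no supporting index, so each SELECT scans the clustered index. A nonclustered index on that column is the usual first step; a sketch (the index name is illustrative):

```sql
-- Lets WHERE UniqueKey = @key seek instead of scanning the table.
CREATE NONCLUSTERED INDEX IX_RequestIdentities_UniqueKey
    ON dbo.RequestIdentities (UniqueKey)
    WITH (FILLFACTOR = 80);
GO
```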
After you have done that, you can use a single query that performs the insert and returns the result of the OUTPUT clause to your client application.
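The query itself is not shown above; one plausible shape, based purely on the question's schema, uses the OUTPUT clause to return the inserted row in the same round trip:

```sql
-- Insert the key and return the generated Id and default Timestamp
-- to the client without a second SELECT.
INSERT INTO RequestIdentities (UniqueKey)
OUTPUT inserted.Id, inserted.UniqueKey, inserted.Timestamp
VALUES (@key);
```

From ADO.NET this is executed like an ordinary SELECT: the OUTPUT rows come back through the data reader.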