Spark JDBC - Read/update/write huge table without int/long primary key


I am trying to update certain columns in a big MySQL table that does not have any primary key.

How can I handle such a big table if its size is, e.g., 6 GB and my executor memory is only 2 GB?

Do you think Spark JDBC would help me here somehow?

If I were to create the table myself, it would have a primary key:

CREATE TABLE Persons (
    Personid varchar(255) NOT NULL,
    LastName varchar(255) NOT NULL,
    FirstName varchar(255) DEFAULT NULL,
    Email varchar(255) DEFAULT NULL,
    Age int(11) DEFAULT NULL,
    PRIMARY KEY (Personid)
) ENGINE = InnoDB DEFAULT CHARSET = latin1;
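One approach I have been considering (a sketch, not a tested solution): since Personid is a varchar and cannot be used as a numeric partitionColumn, the read could be split with explicit predicates instead, so each executor only pulls one slice of the table. The helper below is hypothetical; the partition count and MySQL's CRC32 hashing are my assumptions:

```python
# Hypothetical helper: build one WHERE-clause predicate per partition so
# spark.read.jdbc() can split a table whose key is a varchar, not an int.
# Each predicate becomes its own parallel query, so no single executor
# has to hold the whole 6 GB table in 2 GB of memory.

def hash_predicates(key_column, num_partitions):
    """Return predicates like 'MOD(CRC32(Personid), 8) = 0' (MySQL syntax)."""
    return [
        f"MOD(CRC32({key_column}), {num_partitions}) = {i}"
        for i in range(num_partitions)
    ]

# Usage with Spark (assumes a reachable MySQL server; the URL, user and
# password below are placeholders, not real values):
#
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.getOrCreate()
# df = spark.read.jdbc(
#     url="jdbc:mysql://host:3306/mydb",
#     table="Persons",
#     predicates=hash_predicates("Personid", 8),
#     properties={
#         "user": "...",
#         "password": "...",
#         "driver": "com.mysql.cj.jdbc.Driver",
#     },
# )
```

Would something along these lines work, or is there a better way to parallelize the read without a numeric key?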
