Databricks metastore error - Connection reset


After setting up Databricks in the Azure cloud, I am first trying to execute the query below in my ADB notebook:

spark.sql("CREATE CATALOG IF NOT EXISTS QUICKSTART_CATALOG")

But I'm getting the error below and have not been able to fix it; the full error log is shared below. Is there an issue with my Databricks setup, or is something missing here?

Error java.lang.Exception: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
  at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:313)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:253)
  ... (remaining driver and query-execution frames elided)
Caused by: java.lang.Throwable: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
  at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1169)
  at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1154)
  at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:440)
  ... 79 more
Caused by: java.lang.Throwable: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
  at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
  at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
  ... 97 more
Caused by: java.lang.Throwable: null
  at sun.reflect.GeneratedConstructorAccessor836.newInstance(null)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  ... 102 more
Caused by: java.lang.Throwable: Error creating transactional connection factory
  at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:671)
  at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:830)
  at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:331)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
  ... 106 more
Caused by: java.lang.Throwable: null
  at sun.reflect.GeneratedConstructorAccessor894.newInstance(null)
  at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:330)
  at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:285)
  ... 134 more
Caused by: java.lang.Throwable: Attempt to invoke the "HikariCP" plugin to create a ConnectionPool gave an error : Failed to initialize pool: Could not connect to address=(host=consolidated-eastindia-prod-metastore.mysql.database.azure.com)(port=3306)(type=master) : Could not connect to consolidated-eastindia-prod-metastore.mysql.database.azure.com:3306 : Connection reset
  at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:232)
  ... 150 more
Caused by: java.lang.Throwable: Failed to initialize pool: Could not connect to address=(host=consolidated-eastindia-prod-metastore.mysql.database.azure.com)(port=3306)(type=master) : Could not connect to consolidated-eastindia-prod-metastore.mysql.database.azure.com:3306 : Connection reset
  at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:512)
  at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:71)
  ... 152 more
Caused by: java.lang.Throwable: Could not connect to address=(host=consolidated-eastindia-prod-metastore.mysql.database.azure.com)(port=3306)(type=master) : Could not connect to consolidated-eastindia-prod-metastore.mysql.database.azure.com:3306 : Connection reset
  at org.mariadb.jdbc.internal.protocol.AbstractConnectProtocol.connectWithoutProxy(AbstractConnectProtocol.java:1394)
  at org.mariadb.jdbc.Driver.connect(Driver.java:89)
  at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:341)
  ... 156 more
Caused by: java.lang.Throwable: Could not connect to consolidated-eastindia-prod-metastore.mysql.database.azure.com:3306 : Connection reset
  at org.mariadb.jdbc.internal.protocol.AbstractConnectProtocol.createConnection(AbstractConnectProtocol.java:575)
  ... 163 more
Caused by: java.lang.Throwable: Connection reset
  at java.net.SocketInputStream.read(SocketInputStream.java:210)
  at org.mariadb.jdbc.internal.io.input.ReadAheadBufferedStream.fillBuffer(ReadAheadBufferedStream.java:131)
  at org.mariadb.jdbc.internal.com.read.ReadInitialHandShakePacket.<init>(ReadInitialHandShakePacket.java:89)
  at org.mariadb.jdbc.internal.protocol.AbstractConnectProtocol.createConnection(AbstractConnectProtocol.java:527)


1 Answer

Answered by DileeprajnarayanThumula:

The error java.lang.Exception: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient can have several possible causes:

  • The Hive Metastore database is not available.
  • The Hive Metastore database is not active.
  • The configuration file for Hive is wrong.
  • The schematool command hasn't been executed to set up the Hive Metastore database.
  • A firewall is blocking the connection to the metastore (see the connectivity check after this list).
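
A quick way to test the firewall/network hypothesis, which matches the Connection reset in your log, is to open a plain TCP connection from a notebook on the same cluster to the metastore host and port. This is a minimal diagnostic sketch, not a fix; the host and port are copied from your error log:

import socket

# Host and port taken from the error log; adjust if your metastore differs.
host = "consolidated-eastindia-prod-metastore.mysql.database.azure.com"
port = 3306

try:
    # A timeout or reset here points at a firewall/NSG or network-routing
    # problem rather than a Hive configuration problem.
    with socket.create_connection((host, port), timeout=10):
        print(f"TCP connection to {host}:{port} succeeded")
except OSError as e:
    print(f"TCP connection to {host}:{port} failed: {e}")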

You can resolve this by following the steps below:

Step 1: Review the Hive Metastore connection settings.

javax.jdo.option.ConnectionURL: This setting tells Hive how to connect to its Metastore database via a JDBC URL. By default, it is set to jdbc:derby:;databaseName=metastore_db;create=true.

javax.jdo.option.ConnectionDriverName: The JDBC driver class used for the Metastore database. The default is org.apache.derby.jdbc.EmbeddedDriver.

hive.metastore.warehouse.dir: This points to the directory where Hive stores its warehouse data.
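
On Databricks these values are normally set as cluster Spark configuration (for an external metastore) or managed by Databricks itself, rather than edited in hive-site.xml by hand. As a diagnostic sketch, you can print the values the cluster is actually using from a notebook; note that spark.sparkContext._jsc is an internal py4j handle, so treat this as a debugging aid rather than a stable API:

# Sketch: read the effective metastore connection settings from the
# running cluster's Hadoop configuration (internal py4j access).
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()

for key in (
    "javax.jdo.option.ConnectionURL",
    "javax.jdo.option.ConnectionDriverName",
    "hive.metastore.warehouse.dir",
):
    print(key, "=", hadoop_conf.get(key))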

Step 2: Back up and remove the local Hive Metastore directory:

mv $HIVE_HOME/metastore_db $HIVE_HOME/metastore_db_bk

Step 3: Set up the Metastore database. Execute the schematool command to initialize the Metastore database:

schematool -initSchema -dbType derby

This command creates the Metastore database if it does not already exist and sets up the necessary schema.
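
If you want to confirm the result afterwards, schematool can also report the current schema version (assuming your Hive version's schematool supports the -info flag):

schematool -info -dbType derby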


Since you mentioned that you are trying to create a catalog in Azure Databricks, I tried the following approaches on my end and got the errors below.

%sql
CREATE CATALOG IF NOT EXISTS Dileep_example_catalog;

Error in SQL statement: AnalysisException: [UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.

spark.sql("CREATE CATALOG IF NOT EXISTS example")

AnalysisException: [UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.

%scala
spark.sql("CREATE CATALOG IF NOT EXISTS example")

AnalysisException: [UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.

Using the SQL editor:

CREATE CATALOG IF NOT EXISTS Dileep_example_catalog;

[UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.
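
One way to check whether Unity Catalog is available to your session at all is to list the visible catalogs. A minimal sketch: on a non-UC cluster you will typically see only the legacy spark_catalog/hive_metastore entry, while a UC-enabled cluster also lists catalogs registered in the metastore.

%sql
SHOW CATALOGS;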

Note: CREATE CATALOG applies to Databricks SQL and Databricks Runtime 10.3 and above, and to Unity Catalog only.

If you are using Unity Catalog, this article will help you learn more: Creating a Unity Catalog metastore.

I have tried the example below to create databases under the default Hive metastore catalog.

%sql
CREATE DATABASE IF NOT EXISTS Database02;
USE Database02;


CREATE TABLE IF NOT EXISTS people (
  Name STRING,
  Age INT
);

%sql
CREATE TABLE default.people02 (
  Name STRING
);
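
Assuming the cells above ran successfully, you can verify that the objects landed in the default Hive metastore catalog by listing them back, for example:

%sql
SHOW TABLES IN Database02;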
