How to find less frequently accessed files in HDFS
518 views · Asked by Dipperman

Besides using Cloudera Navigator, how can I find the less frequently accessed files in HDFS?
There is 1 answer:
I assume you are looking for the time a file was last accessed (opened, read, etc.), because the further that time lies in the past, the less frequently the file is being accessed.
While on Linux you can do this quite simply via ls -l with some additional options (e.g. ls -lu shows access times), in HDFS more work is necessary.
Maybe you could monitor /hdfs-audit.log for cmd=open entries for the files in question. Or you could implement a small function that reads out FileStatus.getAccessTime(), as mentioned under "Is there anyway to get last access time of HDFS files?" or "How to get last access time of any files in HDFS?" in the Cloudera Community. In other words, it will be necessary to create a small program which scans all the files, reads out their access-time properties and sorts them. With that you will be able to find files which have not been accessed for a long time.
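As a rough illustration of that last idea, here is a minimal sketch (my own, not from the answer) using the plain HDFS Java API: it lists all files under a given path recursively, reads FileStatus.getAccessTime() for each, and prints them oldest-access-first. The starting path /data is just a placeholder, and access times are only meaningful if dfs.namenode.accesstime.precision is not set to 0 on the cluster.

// Sketch: list files under an HDFS path, sorted by last access time (oldest first).
// Assumes core-site.xml / hdfs-site.xml are on the classpath; "/data" is a placeholder path.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class LeastAccessedFiles {
    public static void main(String[] args) throws Exception {
        Path root = new Path(args.length > 0 ? args[0] : "/data");

        Configuration conf = new Configuration();   // picks up the cluster configuration
        FileSystem fs = FileSystem.get(conf);

        // Collect all files below the root path (true = recursive)
        List<LocatedFileStatus> files = new ArrayList<>();
        RemoteIterator<LocatedFileStatus> it = fs.listFiles(root, true);
        while (it.hasNext()) {
            files.add(it.next());
        }

        // Oldest access time first = least recently accessed files at the top.
        // Note: getAccessTime() stays 0 if access-time tracking is disabled
        // (dfs.namenode.accesstime.precision = 0).
        files.sort(Comparator.comparingLong(LocatedFileStatus::getAccessTime));

        for (LocatedFileStatus status : files) {
            System.out.printf("%tF %<tT  %s%n", status.getAccessTime(), status.getPath());
        }
        fs.close();
    }
}

Compiled against the hadoop-client dependency, you could run it with something like hadoop jar on an edge node and pass the directory to scan as the first argument.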