Environment

Runtime Information

Name             Value
Java Home        /home/sdata/jdk/jre
Java Version     1.8.0_151 (Oracle Corporation)
Scala Version    version 2.12.11

Spark Properties

Name    Value
hive.exec.dynamic.partition.mode    nonstrict
mapreduce.job.run-local    true
spark.app.id    local-1740995754295
spark.app.name    SparkEngine
spark.app.startTime    1740995748506
spark.cleaner.periodicGC.interval    2h
spark.driver.extraJavaOptions    -Djava.net.preferIPv6Addresses=false -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/jdk.internal.ref=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -Djdk.reflect.useDirectMethodHandle=false
spark.driver.host    ecm-5116
spark.driver.port    45888
spark.executor.extraJavaOptions    -Djava.net.preferIPv6Addresses=false -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/jdk.internal.ref=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -Djdk.reflect.useDirectMethodHandle=false
spark.executor.id    driver
spark.hadoop.mapreduce.job.run-local    true
spark.master    local[8]
spark.memory.fraction    0.4
spark.mongodb.input.partitioner    MongoPaginateByCountPartitioner
spark.mongodb.input.partitionerOptions.numberOfPartitions    10
spark.mongodb.input.partitionerOptions.partitionKey    _id
spark.mongodb.input.sampleSize    2000
spark.mongodb.input.uri    mongodb://sdata:3er4#ER$@192.168.1.109:27017,192.168.1.109:27018/sdata_db.sdata_coll?replicaSet=MyMongo
spark.mongodb.output.uri    mongodb://sdata:3er4#ER$@192.168.1.109:27017,192.168.1.109:27018/sdata_db.sdata_coll?replicaSet=MyMongo
spark.port.maxRetries    100
spark.rdd.compress    true
spark.scheduler.mode    FIFO
spark.shuffle.consolidateFiles    true
spark.sql.analyzer.maxIterations    500
spark.sql.caseSensitive    true
spark.sql.crossJoin.enabled    false
spark.sql.legacy.timeParserPolicy    LEGACY
spark.sql.parquet.cacheMetadata    false
spark.sql.parquet.int96RebaseModeInWrite    CORRECTED
spark.sql.shuffle.partitions    200
spark.sql.sources.partitionOverwriteMode    dynamic
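
For context, properties like the ones listed above are normally set when the SparkSession is built (anything not set explicitly falls back to spark-defaults.conf or Spark's defaults). The following is a minimal Scala sketch of such a setup, not the actual SparkEngine code; the MongoDB URI is a placeholder, and only a few of the properties above are shown.

```scala
import org.apache.spark.sql.SparkSession

object SparkEngineConfigSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder URI: in the dump above the real credentials and hosts are embedded here.
    val mongoUri =
      "mongodb://user:password@host1:27017,host2:27018/sdata_db.sdata_coll?replicaSet=MyMongo"

    val spark = SparkSession.builder()
      .appName("SparkEngine")                              // spark.app.name
      .master("local[8]")                                  // spark.master
      .config("spark.memory.fraction", "0.4")
      .config("spark.sql.shuffle.partitions", "200")
      .config("spark.sql.caseSensitive", "true")
      .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
      .config("hive.exec.dynamic.partition.mode", "nonstrict")
      .config("spark.mongodb.input.uri", mongoUri)
      .config("spark.mongodb.output.uri", mongoUri)
      .config("spark.mongodb.input.partitioner", "MongoPaginateByCountPartitioner")
      .config("spark.mongodb.input.partitionerOptions.numberOfPartitions", "10")
      .getOrCreate()

    // Runtime values such as spark.app.id and spark.app.startTime are assigned by Spark itself.
    println(spark.conf.get("spark.app.name"))
    spark.stop()
  }
}
```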

Resource Profiles

Resource Profile Id    Resource Profile Contents
0
Executor Reqs:
	cores: [amount: 1]
	memory: [amount: 1024]
	offHeap: [amount: 0]
Task Reqs:
	cpus: [amount: 1.0]
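
Profile 0 is the default profile Spark derives from the executor settings (here 1 core, 1024 MiB memory, 1 CPU per task). As a point of reference, an equivalent profile could be assembled explicitly with the ResourceProfile API available since Spark 3.1; this is only an illustrative sketch, not something taken from the application.

```scala
import org.apache.spark.resource.{ExecutorResourceRequests, ResourceProfileBuilder, TaskResourceRequests}

// Executor-side requests matching profile 0 above (offHeap stays at its default of 0).
val executorReqs = new ExecutorResourceRequests()
  .cores(1)
  .memory("1024m")

// Task-side requests: one CPU per task.
val taskReqs = new TaskResourceRequests().cpus(1)

// Build the profile; it could then be attached to an RDD via rdd.withResources(profile).
val profile = new ResourceProfileBuilder()
  .require(executorReqs)
  .require(taskReqs)
  .build()

println(profile)
```

Note that attaching a custom (non-default) profile to an RDD is only supported on cluster managers with dynamic allocation enabled, not in local mode as used here.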

Hadoop Properties

System Properties

Metrics Properties

Classpath Entries