|
Let's start by formatting the code and adding line numbers so it is actually readable. You can now clearly see that, at the very least, you have invalid lines at line numbers 5 and 22. I assume these are supposed to be comments, so they require a "//" at the beginning. You also have mixed capitalisation on your setters and getters.
So start by fixing those and see what happens.
1 public class student{
2     private int ID;
3     private String Name;
4
5     Student class constructor
6     student(int id, String name){
7         this.ID = id;
8         this.Name = name;
9     }
10     public int getid(){
11         return this.ID;
12     }
13     public String Getname(){
14         return this.Name;
15     }
16     public void SETid(int i){
17         this.ID = i;
18     }
19     public void sETNAme(String n){
20         this.Name=n;
21     }
22     method to display data
23     public void display() {
24         System.out.println("Student id is: " + id + " "
25             + "and Student name is: "
26             + Name;
27         );
28         System.out.println();
29     }
30 }
|
|
|
|
|
You forgot to add // to the comment lines, that is, line numbers 5 and 22.
public class student{
    private int ID;
    private String Name;

    student(int id, String name){
        this.ID = id;
        this.Name = name;
    }
    public int getid(){
        return this.ID;
    }
    public String Getname(){
        return this.Name;
    }
    public void SETid(int i){
        this.ID = i;
    }
    public void sETNAme(String n){
        this.Name=n;
    }
    public void display() {
        System.out.println("Student id is: " + id + " "
            + "and Student name is: "
            + Name;
        );
        System.out.println();
    }
}
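Note that even with the comment lines handled, this version still will not compile: in display() the field is ID but the code refers to id (the constructor parameter, which is not in scope there), and there is a stray semicolon before the closing parenthesis of the println call. One possible cleaned-up sketch, renaming the class, fields and accessors to follow the usual Java conventions:

// Conventional Java naming: class and constructor capitalised, camelCase fields and accessors.
public class Student {
    private int id;
    private String name;

    // Student class constructor
    public Student(int id, String name) {
        this.id = id;
        this.name = name;
    }

    public int getId() {
        return this.id;
    }

    public String getName() {
        return this.name;
    }

    public void setId(int id) {
        this.id = id;
    }

    public void setName(String name) {
        this.name = name;
    }

    // Method to display data
    public void display() {
        System.out.println("Student id is: " + id
                + " and Student name is: " + name);
        System.out.println();
    }
}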
|
|
|
|
|
See my answer of 9th June above.
|
|
|
|
|
He already did, when he cut'n'pasted it
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Hello. I'm new to Java -- I want to learn, and am going through HFJ. I know it's an old book, but there are no later editions, so I may be behind the times.
I'm not a professional programmer, but I'm on a few programming forums, and I can't seem to get an answer to a question I have regarding the Runnable interface. I'm in chap 15 of the book on threading, and I came across this statement regarding starting a new thread:
public class MyRunnable implements Runnable
Runnable threadJob = new MyRunnable ()
What I don't understand is how can you make a new object with an interface, but use the class MyRunnable. Shouldn't it be MyRunnable thread = new MyRunnable ()?
I'm just confused. Runnable isn't a class, so what's going on?
|
|
|
|
|
|
Any class can be "cast" to an interface that it implements, or to any type it is derived from (e.g. Object).
In your example, a new MyRunnable is created and implicitly upcast to Runnable.
The interface acts like a proxy for all the (different) classes that implement it, allowing access only to the methods declared by the interface.
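As a concrete illustration (a minimal sketch; the body of MyRunnable here is invented just for the example):

// MyRunnable is an ordinary class; implementing Runnable just promises a run() method.
public class MyRunnable implements Runnable {
    @Override
    public void run() {
        System.out.println("running in: " + Thread.currentThread().getName());
    }

    public static void main(String[] args) {
        Runnable threadJob = new MyRunnable();   // the object is a MyRunnable, the reference type is Runnable
        MyRunnable sameThing = new MyRunnable(); // this is also perfectly legal
        threadJob.run();                         // only the methods declared by Runnable are visible here
    }
}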
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
In C++ terms, a Java interface is a pure abstract class that has only member functions and no instance fields.
Any fields declared on an interface are implicitly constants (public static final). (This is a long-standing Java rule.)
It is closer to the COM interface concept.
It could also be called an API contract.
An object can implement multiple interfaces.
This is an easy way to create delayed binding. The thread library was written decades ago, but you can create an Object today that will be usable by this old library as long as you honor the contract defined by the library. For Runnable, you must provide the run() method. The live Thread object will then invoke your run() method when you get your plumbing code correct.
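A minimal sketch of that contract in action (the class name and message are made up for the example):

// The Thread library knows nothing about this class; it only knows the Runnable contract.
public class PlumbingDemo implements Runnable {
    @Override
    public void run() {
        // This is the method the Thread object promises to call back.
        System.out.println("Hello from a worker thread");
    }

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(new PlumbingDemo()); // hand the old library our new object
        worker.start();                                 // the Thread object invokes run() for us
        worker.join();                                  // wait for the worker to finish
    }
}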
|
|
|
|
|
I am connecting a Morpho biometric device with Java in NetBeans 8.2 and I want the source code.
|
|
|
|
|
Sorry, this site does not provide code to order.
|
|
|
|
|
Hi, I am having the following issue when starting Kafka:
confluent local services ksql-server start
The local commands are intended for a single-node development environment only,
NOT for production usage. https:
Using CONFLUENT_CURRENT: /tmp/confluent.122197
ZooKeeper is [UP]
Kafka is [UP]
Schema Registry is [UP]
Starting ksqlDB Server
Error: ksqlDB Server failed to start
kafta@kafta-VirtualBox:~$ confluent local services ksql-server log
The local commands are intended for a single-node development environment only,
NOT for production usage. https:
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
[2022-05-24 11:22:13,133] INFO KsqlConfig values:
ksql.access.validator.enable = auto
ksql.authorization.cache.expiry.time.secs = 30
ksql.authorization.cache.max.entries = 10000
ksql.cast.strings.preserve.nulls = true
ksql.connect.basic.auth.credentials.file =
ksql.connect.basic.auth.credentials.reload = false
ksql.connect.basic.auth.credentials.source = NONE
ksql.connect.error.handler = null
ksql.connect.request.headers.plugin = null
ksql.connect.url = http:
ksql.connect.worker.config =
ksql.create.or.replace.enabled = true
ksql.error.classifier.regex =
ksql.extension.dir = ext
ksql.headers.columns.enabled = true
ksql.hidden.topics = [_confluent.*, __confluent.*, _schemas, __consumer_offsets, __transaction_state, connect-configs, connect-offsets, connect-status, connect-statuses]
ksql.insert.into.values.enabled = true
ksql.internal.topic.min.insync.replicas = 1
ksql.internal.topic.replicas = 1
ksql.lambdas.enabled = true
ksql.metastore.backup.location =
ksql.metrics.extension = null
ksql.metrics.tags.custom =
ksql.nested.error.set.null = true
ksql.output.topic.name.prefix =
ksql.persistence.default.format.key = KAFKA
ksql.persistence.default.format.value = null
ksql.persistence.wrap.single.values = null
ksql.persistent.prefix = query_
ksql.properties.overrides.denylist = []
ksql.pull.queries.enable = true
ksql.query.cleanup.shutdown.timeout.ms = 30000
ksql.query.error.max.queue.size = 10
ksql.query.persistent.active.limit = 2147483647
ksql.query.persistent.max.bytes.buffering.total = -1
ksql.query.pull.consistency.token.enabled = false
ksql.query.pull.enable.standby.reads = false
ksql.query.pull.interpreter.enabled = true
ksql.query.pull.limit.clause.enabled = true
ksql.query.pull.max.allowed.offset.lag = 9223372036854775807
ksql.query.pull.max.concurrent.requests = 2147483647
ksql.query.pull.max.hourly.bandwidth.megabytes = 2147483647
ksql.query.pull.max.qps = 2147483647
ksql.query.pull.metrics.enabled = true
ksql.query.pull.range.scan.enabled = true
ksql.query.pull.router.thread.pool.size = 50
ksql.query.pull.stream.enabled = true
ksql.query.pull.table.scan.enabled = true
ksql.query.pull.thread.pool.size = 50
ksql.query.push.v2.alos.enabled = true
ksql.query.push.v2.catchup.consumer.msg.window = 50
ksql.query.push.v2.continuation.tokens.enabled = false
ksql.query.push.v2.enabled = false
ksql.query.push.v2.interpreter.enabled = true
ksql.query.push.v2.latest.reset.age.ms = 30000
ksql.query.push.v2.max.catchup.consumers = 5
ksql.query.push.v2.max.hourly.bandwidth.megabytes = 2147483647
ksql.query.push.v2.new.latest.delay.ms = 5000
ksql.query.push.v2.registry.installed = false
ksql.query.retry.backoff.initial.ms = 15000
ksql.query.retry.backoff.max.ms = 900000
ksql.query.status.running.threshold.seconds = 300
ksql.query.transient.max.bytes.buffering.total = -1
ksql.queryanonymizer.cluster_namespace = null
ksql.queryanonymizer.logs_enabled = true
ksql.readonly.topics = [_confluent.*, __confluent.*, _schemas, __consumer_offsets, __transaction_state, connect-configs, connect-offsets, connect-status, connect-statuses]
ksql.rowpartition.rowoffset.enabled = true
ksql.runtime.feature.shared.enabled = false
ksql.schema.registry.url = http:
ksql.security.extension.class = null
ksql.service.id = default_
ksql.shared.runtimes.count = 8
ksql.sink.window.change.log.additional.retention = 1000000
ksql.source.table.materialization.enabled = true
ksql.streams.shutdown.timeout.ms = 300000
ksql.suppress.buffer.size.bytes = -1
ksql.suppress.enabled = false
ksql.timestamp.throw.on.invalid = false
ksql.transient.prefix = transient_
ksql.udf.collect.metrics = false
ksql.udf.enable.security.manager = true
ksql.udfs.enabled = true
ksql.variable.substitution.enable = true
metric.reporters = []
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(io.confluent.ksql.util.KsqlConfig:376)
[2022-05-24 11:22:16,956] INFO KsqlRestConfig values:
access.control.allow.headers = []
access.control.allow.methods = []
access.control.allow.origin =
authentication.method = NONE
authentication.realm =
authentication.roles = [*]
authentication.skip.paths = []
ksql.advertised.listener = null
ksql.authentication.plugin.class = null
ksql.endpoint.logging.ignored.paths.regex =
ksql.endpoint.logging.log.queries = false
ksql.healthcheck.interval.ms = 5000
ksql.heartbeat.check.interval.ms = 200
ksql.heartbeat.discover.interval.ms = 2000
ksql.heartbeat.enable = false
ksql.heartbeat.missed.threshold.ms = 3
ksql.heartbeat.send.interval.ms = 100
ksql.heartbeat.thread.pool.size = 3
ksql.heartbeat.window.ms = 2000
ksql.idle.connection.timeout.seconds = 86400
ksql.internal.http2.max.pool.size = 3000
ksql.internal.listener = null
ksql.internal.ssl.client.authentication = NONE
ksql.lag.reporting.enable = false
ksql.lag.reporting.send.interval.ms = 5000
ksql.local.commands.location =
ksql.logging.server.rate.limited.request.paths =
ksql.logging.server.rate.limited.response.codes =
ksql.max.push.queries = 100
ksql.server.command.blocked.threshold.error.ms = 15000
ksql.server.command.response.timeout.ms = 5000
ksql.server.error.messages = class io.confluent.ksql.rest.DefaultErrorMessages
ksql.server.exception.uncaught.handler.enable = false
ksql.server.install.dir = /opt/confluent
ksql.server.preconditions = []
ksql.server.websockets.num.threads = 5
ksql.ssl.keystore.alias.external =
ksql.ssl.keystore.alias.internal =
ksql.verticle.instances = 2
ksql.worker.pool.size = 100
listeners = [http:
query.stream.disconnect.check = 1000
ssl.cipher.suites = []
ssl.client.auth = false
ssl.client.authentication = NONE
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.key.password = [hidden]
ssl.keystore.location =
ssl.keystore.password = [hidden]
ssl.keystore.reload = false
ssl.keystore.type = JKS
ssl.keystore.watch.location =
ssl.truststore.location =
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
(io.confluent.ksql.rest.server.KsqlRestConfig:376)
[2022-05-24 11:22:17,482] INFO KsqlConfig values:
ksql.access.validator.enable = auto
ksql.authorization.cache.expiry.time.secs = 30
ksql.authorization.cache.max.entries = 10000
ksql.cast.strings.preserve.nulls = true
ksql.connect.basic.auth.credentials.file =
ksql.connect.basic.auth.credentials.reload = false
ksql.connect.basic.auth.credentials.source = NONE
ksql.connect.error.handler = null
ksql.connect.request.headers.plugin = null
ksql.connect.url = http:
ksql.connect.worker.config =
ksql.create.or.replace.enabled = true
ksql.error.classifier.regex =
ksql.extension.dir = ext
ksql.headers.columns.enabled = true
ksql.hidden.topics = [_confluent.*, __confluent.*, _schemas, __consumer_offsets, __transaction_state, connect-configs, connect-offsets, connect-status, connect-statuses]
ksql.insert.into.values.enabled = true
ksql.internal.topic.min.insync.replicas = 1
ksql.internal.topic.replicas = 1
ksql.lambdas.enabled = true
ksql.metastore.backup.location =
ksql.metrics.extension = null
ksql.metrics.tags.custom =
ksql.nested.error.set.null = true
ksql.output.topic.name.prefix =
ksql.persistence.default.format.key = KAFKA
ksql.persistence.default.format.value = null
ksql.persistence.wrap.single.values = null
ksql.persistent.prefix = query_
ksql.properties.overrides.denylist = []
ksql.pull.queries.enable = true
ksql.query.cleanup.shutdown.timeout.ms = 30000
ksql.query.error.max.queue.size = 10
ksql.query.persistent.active.limit = 2147483647
ksql.query.persistent.max.bytes.buffering.total = -1
ksql.query.pull.consistency.token.enabled = false
ksql.query.pull.enable.standby.reads = false
ksql.query.pull.interpreter.enabled = true
ksql.query.pull.limit.clause.enabled = true
ksql.query.pull.max.allowed.offset.lag = 9223372036854775807
ksql.query.pull.max.concurrent.requests = 2147483647
ksql.query.pull.max.hourly.bandwidth.megabytes = 2147483647
ksql.query.pull.max.qps = 2147483647
ksql.query.pull.metrics.enabled = true
ksql.query.pull.range.scan.enabled = true
ksql.query.pull.router.thread.pool.size = 50
ksql.query.pull.stream.enabled = true
ksql.query.pull.table.scan.enabled = true
ksql.query.pull.thread.pool.size = 50
ksql.query.push.v2.alos.enabled = true
ksql.query.push.v2.catchup.consumer.msg.window = 50
ksql.query.push.v2.continuation.tokens.enabled = false
ksql.query.push.v2.enabled = false
ksql.query.push.v2.interpreter.enabled = true
ksql.query.push.v2.latest.reset.age.ms = 30000
ksql.query.push.v2.max.catchup.consumers = 5
ksql.query.push.v2.max.hourly.bandwidth.megabytes = 2147483647
ksql.query.push.v2.new.latest.delay.ms = 5000
ksql.query.push.v2.registry.installed = false
ksql.query.retry.backoff.initial.ms = 15000
ksql.query.retry.backoff.max.ms = 900000
ksql.query.status.running.threshold.seconds = 300
ksql.query.transient.max.bytes.buffering.total = -1
ksql.queryanonymizer.cluster_namespace = null
ksql.queryanonymizer.logs_enabled = true
ksql.readonly.topics = [_confluent.*, __confluent.*, _schemas, __consumer_offsets, __transaction_state, connect-configs, connect-offsets, connect-status, connect-statuses]
ksql.rowpartition.rowoffset.enabled = true
ksql.runtime.feature.shared.enabled = false
ksql.schema.registry.url = http:
ksql.security.extension.class = null
ksql.service.id = default_
ksql.shared.runtimes.count = 8
ksql.sink.window.change.log.additional.retention = 1000000
ksql.source.table.materialization.enabled = true
ksql.streams.shutdown.timeout.ms = 300000
ksql.suppress.buffer.size.bytes = -1
ksql.suppress.enabled = false
ksql.timestamp.throw.on.invalid = false
ksql.transient.prefix = transient_
ksql.udf.collect.metrics = false
ksql.udf.enable.security.manager = true
ksql.udfs.enabled = true
ksql.variable.substitution.enable = true
metric.reporters = []
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(io.confluent.ksql.util.KsqlConfig:376)
[2022-05-24 11:23:06,156] INFO JsonConverterConfig values:
converter.type = value
decimal.format = NUMERIC
schemas.cache.size = 1000
schemas.enable = false
(org.apache.kafka.connect.json.JsonConverterConfig:376)
[2022-05-24 11:23:32,941] INFO AdminClientConfig values:
bootstrap.servers = [localhost:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
host.resolver.class = class org.apache.kafka.clients.DefaultHostResolver
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2022-05-24 11:23:35,925] WARN The configuration 'metrics.context.resource.version' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2022-05-24 11:23:35,926] WARN The configuration 'metrics.context.resource.commit.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2022-05-24 11:23:36,968] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:37,068] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:37,173] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:37,173] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:37,374] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:37,374] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:37,686] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:37,686] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:38,202] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:38,206] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:39,129] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:39,436] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:40,991] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:40,991] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:42,020] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:42,024] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:43,064] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:43,064] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:43,996] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:43,997] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:45,241] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:45,242] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:46,174] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:46,175] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:47,101] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:47,102] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:48,342] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:48,343] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:49,569] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:49,569] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:50,794] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:50,899] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:51,808] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:51,809] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:52,740] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:52,740] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:53,871] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:53,872] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:54,900] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:54,901] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:55,910] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:55,911] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:57,019] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:57,019] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:58,045] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:58,046] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:23:59,282] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:23:59,282] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:24:00,308] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:24:00,309] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:24:01,431] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:24:01,431] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:24:02,758] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:24:02,758] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:24:03,675] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:24:03,676] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:24:04,894] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:24:04,894] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:24:06,109] INFO [AdminClient clientId=adminclient-1] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient:1047)
[2022-05-24 11:24:06,114] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:870)
[2022-05-24 11:24:06,140] INFO [AdminClient clientId=adminclient-1] Metadata update failed (org.apache.kafka.clients.admin.internals.AdminMetadataManager:235)
org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment. Call: fetchMetadata
[2022-05-24 11:24:06,143] ERROR Failed to start KSQL (io.confluent.ksql.rest.server.KsqlServerMain:70)
java.lang.RuntimeException: Failed to get Kafka cluster information
at io.confluent.ksql.services.KafkaClusterUtil.getKafkaClusterId(KafkaClusterUtil.java:107)
at io.confluent.ksql.rest.server.KsqlRestApplication.buildApplication(KsqlRestApplication.java:645)
at io.confluent.ksql.rest.server.KsqlServerMain.createExecutable(KsqlServerMain.java:162)
at io.confluent.ksql.rest.server.KsqlServerMain.main(KsqlServerMain.java:63)
Caused by: java.util.concurrent.TimeoutException
at java.base/java.util.concurrent.CompletableFuture.timedGet(CompletableFuture.java:1886)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2021)
at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:180)
at io.confluent.ksql.services.KafkaClusterUtil.getKafkaClusterId(KafkaClusterUtil.java:105)
... 3 more
kafta@kafta-VirtualBox:~$
|
|
|
|
|
This has nothing to do with Java programming. The error messages clearly identify issues with the third-party product. You should go to the Confluent ksqlDB website for help.
|
|
|
|
|
Hash tables provide a mechanism by which you can create indexed tables in which the index is a value other than a string. Implement and test an integer key Open Address Hash table. Implement the following interface. • String get(int k); • void put(int k, String v); • bool contains(int k); • void delete(int k); • void printHash(); You must provide an interactive or command-line test application for the hash table. Make the hash table 31 entries, and make sure at least one collision occurs in your data input. You must delete some data from your table to demonstrate deletion. (100 points)
I am not able to get output. Could anyone please help me with this?
|
|
|
|
|
Nobody is going to do your homework for you. If you don't know where to start, talk to your teacher.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
|
package classes;

import java.io.File;
import java.io.FileNotFoundException;
import java.util.ArrayList;
import java.util.Scanner;

public class Inventory {
    private ArrayList<Item> listItems;

    public Inventory() {
        listItems = new ArrayList<Item>();
        readInventory();
    }

    public void showInventory() {
        for (int i = 0; i < listItems.size(); i++) {
            System.out.println(i + 1 + ". " + listItems.get(i).getName());
        }
    }

    public Item getItem(int index) {
        if (index <= listItems.size()) {
            return listItems.get(index);
        }
        return null;
    }

    public int inventorySize() {
        return listItems.size();
    }

    private void readInventory() {
        try {
            File myObj = new File("data/inventory.txt");
            Scanner myReader = new Scanner(myObj);
            while (myReader.hasNextLine()) {
                String line = myReader.nextLine();
                String[] data = line.split(",");
                String name = data[0];
                double price = Double.parseDouble(data[1]);
                Item item = new Item(name, price);
                listItems.add(item);
            }
            myReader.close();
        } catch (FileNotFoundException e) {
            System.out.println("An error occurred. The inventory file not found.");
        }
    }
}
|
|
|
|
|
Since you didn't supply the exact error message and the line it occurs on, I'm going to take a wild ass guess and say that you don't have a class called Item defined anywhere.
|
|
|
|
|
The Item class could look something like this:
class Item {
    String name;
    double price;

    Item(String name, double price) {
        this.name = name;
        this.price = price;
    }

    String getName() { return name; }
}
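For completeness, a quick way to exercise the two classes together (the file path comes from the posted Inventory code; the sample lines shown for data/inventory.txt are made up):

// Place this in the same package as Inventory (classes), and assume data/inventory.txt contains lines such as:
//   Hammer,9.99
//   Screwdriver,4.50
public class InventoryDemo {
    public static void main(String[] args) {
        Inventory inventory = new Inventory(); // the constructor reads data/inventory.txt
        inventory.showInventory();             // prints "1. Hammer", "2. Screwdriver", ...
        System.out.println("Items loaded: " + inventory.inventorySize());
    }
}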
|
|
|
|
|
I am developing a web application with JSF and PrimeFaces, and I would like to integrate simple wiki functionality consisting only of the creation and visualization of content such as a course, the definition of an object ... etc.
I know that there are open-source wikis such as JSPWiki and DevWiki, but is there another, simpler alternative, such as a dependency that can be integrated into the project?
Thank you for your reply.
|
|
|
|
|
This is Parteek Bajpai, a BE COMP student at Bharati Vidyapeeth College of Engineering, Lavale, Pune. I am posting my problem here in the hope that someone can resolve it.
import java.util.ArrayList;
import java.util.List;

public class Solution {
    public static int first(List<integer> arr, int low, int high, int x, int n) {
        if (high >= low) {
            int mid = low + (high + low) / 2;
            if (mid == 0 || x > arr.get(mid - 1) && arr.get(mid) == x) {
                return mid;
            }
            if (x > arr.get(mid)) {
                return first(arr, (mid + 1), high, x, n);
            } else {
                return first(arr, low, (mid - 1), x, n);
            }
        }
        return -1;
    }

    public static List<integer> relativeSorting(List<integer> arr, List<integer> brr, int n, int m) {
        // Write your code here
        List<integer> temp = new ArrayList<integer>(m);
        List<integer> visited = new ArrayList<integer>(m);
        for(int i=0; i
|
|
|
|
|
Apart from the fact that the posted code is incomplete, you have not given any information about what your problem is.
|
|
|
|
|
Hi,
I am new to Java. Can anyone show me how to read data from a database using JDBC and load it into a JTable?
Most of the JTable samples I have seen either use hard-coded data, read from a database with a fixed number of rows and columns, or use DefaultTableModel.
Besides using a TableModel, is there any other way to display all the data in a JTable?
I am looking for something where the number of rows is not fixed; it should depend on the number of records loaded from the database.
Thank You.
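For what it's worth, the usual approach is to build a DefaultTableModel from the ResultSet metadata, so the number of rows and columns follows whatever the query returns. A minimal sketch, assuming you already have a working JDBC Connection; the table name used in the usage note below is hypothetical:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import javax.swing.JTable;
import javax.swing.table.DefaultTableModel;

public class TableLoader {
    // Builds a table model sized by the query result, not by any hard-coded row count.
    public static DefaultTableModel loadModel(Connection connection, String sql) throws Exception {
        try (Statement statement = connection.createStatement();
             ResultSet rs = statement.executeQuery(sql)) {
            ResultSetMetaData meta = rs.getMetaData();
            int columnCount = meta.getColumnCount();

            DefaultTableModel model = new DefaultTableModel();
            for (int i = 1; i <= columnCount; i++) {
                model.addColumn(meta.getColumnLabel(i)); // column headers from the database
            }
            while (rs.next()) {                          // one model row per database record
                Object[] row = new Object[columnCount];
                for (int i = 1; i <= columnCount; i++) {
                    row[i - 1] = rs.getObject(i);
                }
                model.addRow(row);
            }
            return model;
        }
    }

    public static JTable loadTable(Connection connection, String sql) throws Exception {
        return new JTable(loadModel(connection, sql)); // the JTable shows whatever the model contains
    }
}

Usage would be something like new JScrollPane(TableLoader.loadTable(conn, "SELECT * FROM customers")) dropped into your frame; the row count simply follows the query.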
|
|
|
|
|
|
|
public class JavaCharacterisWhitespaceExample_1 {
    public static void main(String[] args) {
        // Initialize three codepoints: cp1, cp2 and cp3
        int cp1 = 49;
        int cp2 = 121;
        int cp3 = 234;
        // Check whether the codepoints are whitespaces or not.
        boolean check1 = Character.isWhitespace(cp1);
        boolean check2 = Character.isWhitespace(cp2);
        boolean check3 = Character.isWhitespace(cp3);
        // Print the result.
        if (check1) {
            System.out.print("The codepoint \'" + cp1 + "\' is a whitespace character.\n");
        } else {
            System.out.print("The codePoint \'" + cp1 + "\' is not a whitespace character.\n");
        }
        if (check2) {
            System.out.print("The codepoint \'" + cp2 + "\' is a whitespace character.\n");
        } else {
            System.out.print("The codePoint \'" + cp2 + "\' is not a whitespace character.\n");
        }
        if (check3) {
            System.out.print("The codepoint \'" + cp3 + "\' is a whitespace character.\n");
        } else {
            System.out.print("The codePoint \'" + cp3 + "\' is not a whitespace character.\n");
        }
    }
}
Output:
-----------
The codePoint '49' is not a whitespace character.
The codePoint '121' is not a whitespace character.
The codePoint '234' is not a whitespace character.
--------------------------------------------------------------------------------------------------------
public class JavaCharacterisWhitespaceExample_2 {
    public static void main(String[] args) {
        // Initialize three codepoints: cp1, cp2 and cp3
        int cp1 = 9;
        int cp2 = 10;
        int cp3 = 13;
        // Check whether the codepoints are whitespaces or not.
        boolean check1 = Character.isWhitespace(cp1);
        boolean check2 = Character.isWhitespace(cp2);
        boolean check3 = Character.isWhitespace(cp3);
        // Print the result.
        if (check1) {
            System.out.print("The codepoint \'" + cp1 + "\' is a whitespace character.\n");
        } else {
            System.out.print("The codePoint \'" + cp1 + "\' is not a whitespace character.\n");
        }
        if (check2) {
            System.out.print("The codepoint \'" + cp2 + "\' is a whitespace character.\n");
        } else {
            System.out.print("The codePoint \'" + cp2 + "\' is not a whitespace character.\n");
        }
        if (check3) {
            System.out.print("The codepoint \'" + cp3 + "\' is a whitespace character.\n");
        } else {
            System.out.print("The codePoint \'" + cp3 + "\' is not a whitespace character.\n");
        }
    }
}
Output:
-----------
The codepoint '9' is a whitespace character.
The codepoint '10' is a whitespace character.
The codepoint '13' is a whitespace character.
My question is: why are 9, 10 and 13 whitespace characters while 49, 121 and 234 are not, even though all of them are numbers?
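For context, Character.isWhitespace(int) treats its argument as a Unicode code point, not as a printed number: code points 9, 10 and 13 are the TAB, LINE FEED and CARRIAGE RETURN control characters, while 49, 121 and 234 are the characters '1', 'y' and 'ê'. A small sketch to see which character each code point actually stands for:

public class CodePointDemo {
    public static void main(String[] args) {
        int[] codePoints = {9, 10, 13, 49, 121, 234};
        for (int cp : codePoints) {
            // (char) cp is the character the code point names; 9, 10 and 13 print as invisible control characters.
            System.out.println("code point " + cp + " = '" + (char) cp + "'"
                    + ", isWhitespace = " + Character.isWhitespace(cp));
        }
    }
}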
|
|
|
|
|