11-23-2016 02:18 AM
I am new to RapidMiner. I have successfully installed and configured RapidMiner 7.3 and it works fine. The problem is that when I try to configure Hadoop, I get the error below:
[Nov 23, 2016 12:39:42 PM] SEVERE: java.util.concurrent.TimeoutException
[Nov 23, 2016 12:39:42 PM] SEVERE: Hive server 2 connection test timed out. Please check that the server/daemon runs and is accessible on the address and port you specified.
[Nov 23, 2016 12:39:42 PM] SEVERE: Test failed: Hive connection
[Nov 23, 2016 12:39:42 PM] SEVERE: Connection test for 'Hadoop' failed.
My Hadoop distribution is Apache,
my Hive version is 1.2.1,
and all my ports are the default ports. If anybody knows the cause, please help me.
Thanks in advance,
11-24-2016 05:01 AM
It is difficult to figure out the problem from this information alone, but there are a couple of hints:
The error simply states that there was no response from the HiveServer2 instance (specified by either the Master Address or the Hive Server Address field, together with the Hive Port) within the given time.
I would first verify that the HiveServer2 host and port are actually reachable from the machine running Studio.
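A timeout like this usually means the port is simply not reachable from the machine running Studio (firewall, wrong address, or the daemon not listening). A minimal sketch of such a reachability test, assuming the default HiveServer2 port 10000 (substitute your own host and port):

```python
import socket

def port_is_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Default HiveServer2 port is 10000; replace with your cluster's values.
print(port_is_open("localhost", 10000))
```

If this prints False from the Studio machine but Beeline works on the cluster node itself, the problem is network-level (binding to 127.0.0.1 only, or a firewall), not a Radoop setting.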
11-24-2016 11:17 PM
Thanks for the reply.
I have verified the Hive cluster through Beeline and it is working fine. Now when I try to connect, I get the error below.
I have attached screenshots of Hive working through Beeline and of the Hadoop connection in RapidMiner.
If you know the cause, please help me out. Anyway, thanks for your reply.
[Nov 25, 2016 9:43:25 AM] SEVERE: java.lang.RuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/MetaException
[Nov 25, 2016 9:43:25 AM] SEVERE: Hive server 2 connection test failed. Please check that the server/daemon runs and is accessible on the address and port you specified.
[Nov 25, 2016 9:43:25 AM] SEVERE: Test failed: Hive connection
[Nov 25, 2016 9:43:25 AM] SEVERE: Connection test for 'Hadoop' failed.
11-25-2016 05:04 AM
The screenshots help.
The first thing is that the JDBC URL Postfix field is only for an additional, custom postfix (the URL itself is constructed automatically), so it should be empty in your case. That alone could solve the connection problem.
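For reference, HiveServer2 JDBC URLs follow the standard `jdbc:hive2://host:port/database` format, with any postfix appended at the end. A rough sketch of how such a URL is assembled (the function is hypothetical, for illustration only, not Radoop's actual code):

```python
def build_hive_jdbc_url(host, port, database="default", postfix=""):
    """Assemble a HiveServer2 JDBC URL; the postfix is appended only if given."""
    url = f"jdbc:hive2://{host}:{port}/{database}"
    return url + postfix if postfix else url

print(build_hive_jdbc_url("localhost", 10000))
# → jdbc:hive2://localhost:10000/default
```

Putting the full URL (or any stray text) into the postfix field corrupts the generated URL, which is why clearing the field can fix the connection.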
However, the java.lang.NoClassDefFoundError indicates that temporary files or folders of Studio (usually under /tmp/ on Linux, for example) may have been deleted since the software was started. Is that possible? If you keep getting this error, restarting Studio should help.
The address in the connection is localhost; does that mean you are running Studio on the master node? (Beeline, of course, runs on the Hadoop node, but Studio may not.)
I would also make sure that 54310 is the correct port for the NameNode. If you navigate to localhost:50070, is that the port shown on the Overview page?
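Another way to double-check the NameNode port is to look at the `fs.defaultFS` property (or the older `fs.default.name`) in the cluster's core-site.xml: the port in that URI is the one the connection should use. A small sketch, assuming a standard core-site.xml layout:

```python
import xml.etree.ElementTree as ET

def namenode_uri(core_site_xml):
    """Return the fs.defaultFS / fs.default.name value from core-site.xml text."""
    root = ET.fromstring(core_site_xml)
    for prop in root.iter("property"):
        if prop.findtext("name") in ("fs.defaultFS", "fs.default.name"):
            return prop.findtext("value")
    return None

sample = """<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>"""

print(namenode_uri(sample))  # → hdfs://localhost:54310
```

If the URI there does not say 54310, use whatever host and port it does contain in the Radoop connection.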