"Re: About Rapidminer Studio using Radoop - Hive problem"

phellinger Employee, Member Posts: 103 RM Engineering
edited June 2019 in Help

Hi Maziar,


Do you use the same Username and Password in the SQL Settings part of the connection dialog as you use in Beeline?

That username is usually hive, unless you are using LDAP or Kerberos, for example.


(You can check the JDBC URL that Radoop builds: enable the Log panel via View -> Show Panel -> Log, set the log level to FINE (right-click on the panel -> Set log level -> FINE), and run a connection test. You should see a line saying "Connecting to Hive. JDBC url:". You can ignore the radoop_hive_0.13.0 prefix; the rest should look similar to the URL that you use in Beeline.)
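For comparison, a typical Beeline-style Hive JDBC URL looks like the following (hostname, port, and database name here are just placeholders):

```
jdbc:hive2://localhost:10000/default
```

Apart from the prefix mentioned above, the URL in the Radoop log should match yours in host, port, and database.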






    tasbihmr Member Posts: 9 Contributor I

    Dear All,

    I get an error regarding RapidMiner Radoop. I downloaded and installed the Radoop extension into the latest version of RapidMiner Studio, and I can see all the extensions installed. I click Manage Radoop Extension and fill in the fields for Hadoop and Hive by loading the Hadoop configuration files, and everything seems OK. My Hive version is also the latest, and I have a jdbc:hive2 connection via the Beeline command-line tool, which connects to the Hive databases backed by MySQL. However, the RapidMiner Studio connection fails with the error:

    " serverProtocolVersion" is unset" .


    Also I get the following output:


    (org.apache.hadoop.security.authorize.AuthorizationException): User: maziar is not allowed to impersonate root
        at org.apache.hadoop.ipc.Client.call(Client.java:1470)
        at org.apache.hadoop.ipc.Client.call(Client.java:1401)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
        at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
        at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy31.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1977)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:674)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:622)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:550)
        ... 28 more



    User "maziar" is my Hadoop user and user "root" is the mySQL Hive user.

    Can anyone help me solve this error? Please note that the installed versions of RapidMiner Studio, Hadoop, and Hive are all the latest.

    Best regards,


    Thomas_Ott RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 1,761 Unicorn

    I'm going to move this thread to the Radoop forum.


    tasbihmr Member Posts: 9 Contributor I

    Thank you Peter for your reply.

    It was very helpful. However, I have a problem as follows:

    I can use Beeline from the command line; once inside, I can use the command "!connect jdbc:hive2:// " to connect. I am then prompted for a username and password, which is Beeline asking for credentials to connect to Hive. My Hive metastore is backed by a MySQL database rather than Apache Derby. The credentials for the MySQL database are root:root, so when I enter these in Beeline, I get a jdbc:hive2 connection and can see the Hive tables inside MySQL.

    I don't know how to apply this to Radoop, since Radoop gives an error that the username and password are not correct. Also, I cannot use the URL jdbc:hive2:// ; it has to be jdbc:hive2://localhost:10000 . If I use this full URL in Beeline, I also get a credentials error.


    So where do you think I am making the mistake?


    Considering that the problem is largely related to credentials, how come I get the same credentials error when I use the full URL string in Beeline, while the short form of the URL, as explained above, works (at least in Beeline)? Is the problem in my Hadoop installation? My Hadoop installation uses the username and password maziar:admin, while Hive uses root:root, which is also the username and password for the MySQL database that Hive accesses.
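    One alternative I am considering, assuming a default HiveServer2 setup: disabling impersonation in hive-site.xml so that queries run as the HiveServer2 process user rather than the connecting user. I am not sure whether this is appropriate here:

    ```xml
    <!-- hive-site.xml: run queries as the HiveServer2 process user instead
         of impersonating the connecting end user -->
    <property>
      <name>hive.server2.enable.doAs</name>
      <value>false</value>
    </property>
    ```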



