Hibernate, blob and large files

BLE
Fan

Hello,

I use Hibernate to read and write data in a Teradata database.

I have large XML files to store in a table. For that I use a BLOB column.

Storing the files via BTEQ works fine, but when I go through Hibernate, I get the following error:

Caused by: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata JDBC Driver] [TeraJDBC 14.10.00.17] [Error 1186] [SQLState HY000] Parameter 2 length is 129120 bytes, which is greater than the maximum 64000 bytes that can be set.


My table:

ColumnName    Type       Length           Format
id            INTEGER    4                -(10)9
binary_lob    BLOB       2,097,088,000    X(64000)

My settings:

<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
"-//Hibernate/Hibernate Configuration DTD 3.0//EN"
"http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
<session-factory name="sessionFactoryToa">
<property name="hibernate.bytecode.use_reflection_optimizer">false</property>
<!-- Database connection settings -->
<property name="hibernate.dialect">org.hibernate.dialect.TeradataDialect</property>
<property name="hibernate.connection.driver_class">com.ncr.teradata.TeraDriver</property>
<property name="hibernate.connection.url">jdbc:teradata://192.168.1.210/my_dwh,CLIENT_CHARSET=cp936,TMODE=TERA,CHARSET=utf8,LOB_SUPPORT=on,LOB_TEMP_TABLE=toa.JdbcLobUpdate</property>
<property name="hibernate.connection.username">admin</property>
<property name="hibernate.connection.password">admin</property>
<property name="hibernate.current_session_context_class">org.hibernate.context.ThreadLocalSessionContext</property>
<property name="hibernate.default_schema">my_dwh</property>
<!-- Schema management behavior:
validate: validates the schema; no changes are made to the database structure
update: updates the existing schema
create: creates the schema, dropping any pre-existing data
create-drop: same as create, but also drops the schema at the end of the session -->
<!-- <property name="hbm2ddl.auto">create</property> -->
<!-- Hibernate configuration -->
<property name="hibernate.search.autoregister_listeners">false</property>
<property name="hibernate.show_sql">true</property>
<property name="current_session_context_class">thread</property>
<property name="hibernate.generate_statistics">false</property>
<property name="hibernate.use_sql_comments">false</property>
<!-- C3P0 connection pool -->
<property name="hibernate.c3p0.validate">true</property>
<property name="connection.provider_class">org.hibernate.connection.C3P0ConnectionProvider</property>
<property name="hibernate.c3p0.min_size">2</property>
<property name="hibernate.c3p0.max_size">20</property>
<property name="hibernate.c3p0.timeout">14000</property>
<property name="hibernate.c3p0.max_statements">50</property>
<property name="hibernate.c3p0.preferredTestQuery">SELECT 1;</property>
<property name="hibernate.c3p0.testConnectionOnCheckout">true</property>
<!-- Table mappings -->
<mapping resource="fr/trimane/TOA/bean/dwh/Dimension.hbm.xml" />
<mapping resource="fr/trimane/TOA/bean/dwh/Fact1.hbm.xml" />
<mapping resource="fr/trimane/TOA/bean/dwh/Fact2.hbm.xml" />
<mapping resource="fr/trimane/TOA/bean/dwh/Fact3.hbm.xml" />
<mapping resource="fr/trimane/TOA/bean/dwh/Fact4.hbm.xml" />
<mapping resource="fr/trimane/TOA/bean/dwh/Fact10000.hbm.xml" />
</session-factory>
</hibernate-configuration>

Is it possible to load large files through Hibernate?

Did I forget something in my settings?

Thank you

7 REPLIES
Junior Supporter

Re: Hibernate, blob and large files

Hi.

I don't know much about Hibernate (I've learned enough to hate it, though), but why does your 2MB BLOB column seem to have a 64KB format?

Cheers.

Carlos.

Teradata Employee

Re: Hibernate, blob and large files

Do not use connection parameter CLIENT_CHARSET=cp936 in conjunction with CHARSET=utf8. You are very likely to corrupt your data by doing that.

Generally speaking, it is a bad idea to use the CLIENT_CHARSET connection parameter. But if you must use it, then it should only be used in conjunction with CHARSET=ASCII or omitted CHARSET (the default is CHARSET=ASCII).
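Applied to the connection URL from the original post, that advice amounts to removing CLIENT_CHARSET=cp936 and keeping the remaining parameters as the poster set them:

```xml
<!-- Same URL as in the original configuration, with CLIENT_CHARSET removed -->
<property name="hibernate.connection.url">jdbc:teradata://192.168.1.210/my_dwh,TMODE=TERA,CHARSET=utf8,LOB_SUPPORT=on,LOB_TEMP_TABLE=toa.JdbcLobUpdate</property>
```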

BLE
Fan

Re: Hibernate, blob and large files

Hello,

My BLOB column can contain up to 2GB. 

As for the 64K format, if I understand correctly, that is the chunking Teradata performs.

My understanding is that Teradata stores LOB fields in a separate table, and the stored file is split into 64K packets.

I removed the CLIENT_CHARSET parameter, but unfortunately I still cannot save my files :(

Teradata Employee

Re: Hibernate, blob and large files

The Teradata JDBC Driver Engineering team is investigating your issue. We will reply again after our investigation.

Re: Hibernate, blob and large files

Hi!

I have the same problem with my Teradata database: I want to save binary files (BLOB) whose size exceeds 100 MB.

My JDBC connections are managed by EclipseLink, and the config file contains the following:

<persistence-unit name="teradata">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<mapping-file>META-INF/eclipselink-orm.xml</mapping-file>
<class>my_db.MyTable</class>
<exclude-unlisted-classes>true</exclude-unlisted-classes>
<properties>
<property name="eclipselink.logging.level" value="INFO" />
<property name="eclipselink.jdbc.driver" value="com.ncr.teradata.TeraDriver" />
<property name="eclipselink.jdbc.url" value="jdbc:teradata://localhost/my_db,CLIENT_CHARSET=cp936,TMODE=TERA,CHARSET=utf8,LOB_SUPPORT=on" />
<property name="eclipselink.jdbc.user" value="myValue" />
<property name="eclipselink.jdbc.password" value="myValue" />
<property name="eclipselink.ddl-generation" value="none" />
<property name="eclipselink.ddl-generation.output-mode" value="database" />
<property name="eclipselink.orm.throw.exceptions" value="true" />
</properties>
</persistence-unit>

thanks

Teradata Employee

Re: Hibernate, blob and large files

bchalal,

Do not use connection parameter CLIENT_CHARSET=cp936 in conjunction with CHARSET=utf8. You are very likely to corrupt your data by doing that.

Generally speaking, it is a bad idea to use the CLIENT_CHARSET connection parameter. But if you must use it, then it should only be used in conjunction with CHARSET=ASCII or omitted CHARSET (the default is CHARSET=ASCII).

Teradata Employee

Re: Hibernate, blob and large files

Teradata is capable of supporting BLOBs over 64000 bytes, but there is a 64000-byte limit when

PreparedStatement.setBytes(int parameterIndex, byte[] x)

is called, and that is the limit you appear to be hitting.
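At the plain JDBC level, the stream-binding path avoids this limit: the value is bound with setBinaryStream instead of setBytes. A minimal sketch, assuming a Connection to the poster's database; the table and column names are taken from the original post, but this is an illustration, not driver documentation:

```java
import java.io.ByteArrayInputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;

class StreamBindSketch {
    // Binds the large value as a stream rather than via setBytes(),
    // so the 64000-byte parameter limit described above does not apply.
    static void insert(Connection con, int id, byte[] xml) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO my_dwh.my_table (id, binary_lob) VALUES (?, ?)")) {
            ps.setInt(1, id);
            ps.setBinaryStream(2, new ByteArrayInputStream(xml), xml.length);
            ps.executeUpdate();
        }
    }
}
```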

This Hibernate code from org.hibernate.type.descriptor.sql.BlobTypeDescriptor is where the decision to use setBytes is made: when the value is a byte[], it sets the descriptor to PRIMITIVE_ARRAY_BINDING, as seen below:

@Override
public <X> BasicBinder<X> getBlobBinder(final JavaTypeDescriptor<X> javaTypeDescriptor) {
    return new BasicBinder<X>( javaTypeDescriptor, this ) {
        @Override
        protected void doBind(PreparedStatement st, X value, int index, WrapperOptions options) throws SQLException {
            BlobTypeDescriptor descriptor = BLOB_BINDING;
            if ( byte[].class.isInstance( value ) ) {
                // performance shortcut for binding BLOB data in byte[] format
                descriptor = PRIMITIVE_ARRAY_BINDING;
            }
            else if ( options.useStreamForLobBinding() ) {
                descriptor = STREAM_BINDING;
            }
            descriptor.getBlobBinder( javaTypeDescriptor ).doBind( st, value, index, options );
        }
    };
}

You can work around this by using

session.getLobHelper().createBlob(byte[] ba);

to create the Blob.

This will later reach the same getBlobBinder code, but since the value is then a Blob rather than a byte[], the descriptor will be set to STREAM_BINDING, bypassing the 64000-byte restriction.
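A minimal sketch of that workaround, assuming an open Hibernate Session; the entity class, its field names, and the file path are hypothetical, chosen to match the id/binary_lob table from the original post:

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Blob;
import org.hibernate.Session;

// Hypothetical entity for the poster's table (id INTEGER, binary_lob BLOB);
// binaryLob would be mapped as a blob type in its .hbm.xml mapping.
class XmlDocument {
    private Integer id;
    private Blob binaryLob;

    public void setId(Integer id) { this.id = id; }
    public void setBinaryLob(Blob b) { this.binaryLob = b; }
}

class SaveExample {
    // Wraps the byte[] in a Blob via LobHelper before persisting, so
    // Hibernate takes the STREAM_BINDING path instead of setBytes().
    static void save(Session session, String path) throws Exception {
        byte[] bytes = Files.readAllBytes(Paths.get(path));
        XmlDocument doc = new XmlDocument();
        doc.setId(1);
        doc.setBinaryLob(session.getLobHelper().createBlob(bytes));
        session.beginTransaction();
        session.save(doc);
        session.getTransaction().commit();
    }
}
```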