Custom ColumnTypeScanType #2245
Comments
Wrt numeric types: Go doesn't have a builtin decimal type; float64 is the largest floating point type. The other thing seems to be related to gorm. I don't think pgx should start to support gorm or any other ORM.
You're right, I can use decimal for numeric types and implement Scan and Value methods for my custom type. But again, my query is dynamic and I don't know the output shape, so I have to pass map[string]any as the scan target. In that case this method in pgx gets called: `func (r *Rows) ColumnTypeScanType(index int) reflect.Type`, but this is not the behaviour I want for casting database types to Go types.
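A minimal sketch of the custom type approach acknowledged above, assuming the third-party github.com/shopspring/decimal package; the Numeric wrapper name, the string/[]byte/float64 cases it handles, and the value in main are illustrative, not pgx API:

```go
package main

import (
	"database/sql/driver"
	"fmt"
	"log"

	"github.com/shopspring/decimal" // assumed third-party decimal library
)

// Numeric wraps decimal.Decimal so PostgreSQL numeric values are never
// forced through float64.
type Numeric struct {
	decimal.Decimal
}

// Scan implements sql.Scanner, accepting the representations drivers
// commonly hand back for numeric columns.
func (n *Numeric) Scan(src any) error {
	switch v := src.(type) {
	case string:
		d, err := decimal.NewFromString(v)
		if err != nil {
			return err
		}
		n.Decimal = d
		return nil
	case []byte:
		d, err := decimal.NewFromString(string(v))
		if err != nil {
			return err
		}
		n.Decimal = d
		return nil
	case float64:
		n.Decimal = decimal.NewFromFloat(v)
		return nil
	default:
		return fmt.Errorf("cannot scan %T into Numeric", src)
	}
}

// Value implements driver.Valuer, sending the exact string form.
func (n Numeric) Value() (driver.Value, error) {
	return n.Decimal.String(), nil
}

func main() {
	var n Numeric
	if err := n.Scan("123456789012345678901234567890.5"); err != nil {
		log.Fatal(err)
	}
	fmt.Println(n) // exact value, no float64 rounding
}
```

This only helps when the destination is explicitly *Numeric; with a generic destination such as map[string]any the driver still chooses the default Go type, which is the limitation raised in this issue.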
In the native pgx interface you can entirely replace the logic for any given type by registering a new Codec for that PostgreSQL OID. But I don't think you can change what you want in the stdlib adapter. AFAIK, the database/sql interface doesn't allow it.
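As a hedged sketch of the Codec-registration approach mentioned above: pgx v5 exposes a per-connection type map, and the separately maintained github.com/jackc/pgx-shopspring-decimal package registers a numeric codec backed by shopspring/decimal (effectively a RegisterType call for the numeric OID). The DATABASE_URL environment variable and module versions are assumptions here:

```go
package main

import (
	"context"
	"log"
	"os"

	pgxdecimal "github.com/jackc/pgx-shopspring-decimal"
	"github.com/jackc/pgx/v5"
	"github.com/shopspring/decimal"
)

func main() {
	ctx := context.Background()

	conn, err := pgx.Connect(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close(ctx)

	// Replace the default numeric handling on this connection's type map
	// with a codec that decodes into shopspring/decimal values.
	pgxdecimal.Register(conn.TypeMap())

	var total decimal.Decimal
	if err := conn.QueryRow(ctx, "select 123456789012345678901234567890.5::numeric").Scan(&total); err != nil {
		log.Fatal(err)
	}
	log.Println(total) // exact value, no float64 involved
}
```

This customization lives on the native pgx connection's type map; as noted above, the database/sql adapter does not expose an equivalent way to change what ColumnTypeScanType reports.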
Is there a way to change the default scanType for a pgType?
For example, I have a custom type for SQL date types that implements the Scan and Value methods, but when I run a dynamic query and pass map[string]any as the gorm Scan destination, it still uses time.Time{} for date types.
Also, for the database numeric type it uses float64, which in some cases overflows.
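For context, a minimal sketch of the generic-destination behaviour being described, using database/sql with the pgx stdlib driver directly instead of gorm (the DATABASE_URL connection string and the sample query are assumptions); with *any destinations the driver, not the caller, picks the Go type for each column:

```go
package main

import (
	"database/sql"
	"log"
	"os"

	_ "github.com/jackc/pgx/v5/stdlib" // registers the "pgx" database/sql driver
)

func main() {
	db, err := sql.Open("pgx", os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	rows, err := db.Query("select current_date as d, 123456789012345678901234567890.5::numeric as n")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	cols, err := rows.Columns()
	if err != nil {
		log.Fatal(err)
	}

	for rows.Next() {
		// Generic destinations: the driver decides the Go representation per
		// column; custom Scanner types are never consulted here.
		vals := make([]any, len(cols))
		ptrs := make([]any, len(cols))
		for i := range vals {
			ptrs[i] = &vals[i]
		}
		if err := rows.Scan(ptrs...); err != nil {
			log.Fatal(err)
		}
		for i, c := range cols {
			// With this driver, a date column typically arrives as time.Time
			// and a numeric as float64, where precision can be lost.
			log.Printf("%s: %T", c, vals[i])
		}
	}
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
}
```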