Yes we have considered it, and it is not impossible it will appear at some point. Once we add support for custom methods (https://github.com/realm/realm-java/issues/909) this will become a lot easier though as you can then do the conversion in the getter/setter.
That said, having a custom type adapter will reduce a lot of boilerplate code across model classes.
We're already using custom getters for conversion by using @Ignore
annotation, but with TypeAdapters, it'll become super easy :D
I'm reopening this, because we didn't have a public issue tracking this. Might as well be this one :)
@cmelchior Any update on this? This would be awesome for java.time classes, which can be used on Android using ThreeTenABP.
Hi @AndreasBackx Sorry, no update. Right now we are focusing on other features, but after 0.88 this use case should be a lot easier to handle as you can make the conversion in the getter/setters themselves.
@cmelchior yes. I'm using custom getters and setters right now. Thank you for the reply.
@cmelchior This is a reply to the comment made in #1072.
I don't see why having a simple yet atrocious workaround makes for a low priority. The example you're giving me is what I gave and what has already been talked about in this issue. The problem is that you'll have to write the getter/setter methods for every single instance. This is a pain, and if you change one thing, you're going to have to change all of the other ones. This can cause bugs if one forgets to change one of them.
Making an intermediary object is, as I have stated in my previous comment(s), adding a barrier to the data layer of your code. The few barriers people seemingly have to face with Realm is why Realm is getting more popular. But it kind of feels like I'm working with raw SQL again when I keep having to add barriers to my code.
I would much rather have something supported by Realm itself than having to do a workaround everywhere, and it's not as if I could ever be the only one. I'm sure there are a lot of people (and I'm not solely talking about people running Realm) that use types that are not supported by Realm. Why should we have to create hacky workarounds for what is normally regarded as good code?
I'm willing to help make a PR for this, but I would have to know what your vision is on this first.
The problem with type adapters is that for converters to be truly elegant you need to hook into our RealmTransformer, which means defining these converters will be non-trivial. Also, you eventually need to map your own class to something that can be stored in Realm. That code cannot be automated, as Realm is restricted in which types it can save, so the user must decide how non-standard types get converted, which means the converter must be hand-written.
I never said that that code should be automated; I don't see how that would even make sense when converters are a lot more straightforward.
At which point making factory converters is just as easy.
I don't see how it is easier to maintain tens of setters and getters.
```java
public class Foo extends RealmObject {

    // Realm ignores static fields, so the formatter is not part of the schema.
    private static final DateTimeFormatter FORMATTER = DateTimeFormatter.ISO_LOCAL_TIME;

    public String _time;

    @Ignore
    private LocalTime time;

    public LocalTime getTime() {
        if (time == null && _time != null) {
            time = LocalTime.parse(_time, FORMATTER);
        }
        return time;
    }

    public void setTime(LocalTime time) {
        this.time = time;
        this._time = time == null ? null : FORMATTER.format(time);
    }
}
```
This would at least not format and parse it every time the field is requested, but this could be changed to:

```java
public class Foo extends RealmObject {
    @Converter(LocalTimeConverter.class)
    public LocalTime time;
}

public class LocalTimeConverter extends RealmConverter<LocalTime, String> {

    public LocalTime fromDatabase(String value) {
        return value == null ? null : LocalTime.parse(value, DateTimeFormatter.ISO_LOCAL_TIME);
    }

    public String toDatabase(LocalTime value) {
        return value == null ? null : DateTimeFormatter.ISO_LOCAL_TIME.format(value);
    }
}
```
It could also be added to the RealmConfiguration.Builder (see the sketch below). But you have to admit that this is a lot easier to maintain than copy-pasting getters and setters or having to use getValue() and setValue() everywhere in your code.
Adding converters will also make it more performant/efficient because parsing only needs to happen in a query and formatting only needs to happen in a transaction.
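To make that registration idea concrete: none of this exists in Realm today, and addConverter is a made-up method name, but registering a default converter on the configuration could look roughly like this:

```java
// Hypothetical API sketch: RealmConfiguration.Builder has no addConverter() method today.
// The idea is that LocalTimeConverter becomes the default converter for all LocalTime fields.
RealmConfiguration config = new RealmConfiguration.Builder()
        .addConverter(LocalTime.class, new LocalTimeConverter())
        .build();
Realm realm = Realm.getInstance(config);
```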
I'm not sure what you mean by intermediate objects? It will never be possible to store arbitrary objects in Realm, precisely because we need to verify that all the fields in that object can be stored. This means, no matter what, you need to write the RealmPoint class by hand + some converter functions. It is not an intermediate object, it is the presentation in Realm. Converting to some basic type like String will e.g. make it impossible to query on fields, so should be avoided. Querying in general will be tricky and you would need to map between fields as well I think.
Ideally I would like to only register the type adapter on the configuration since defining the same type adapter on multiple classes is annoying. However, the information needs to be accessible from the annotation processor, which probably means we need to use the @Converter annotation.
What concerns me more is that this actually breaks our lazy loading, since we cannot return e.g. a LocalTime that is "live". It would have to be a copy of the data from Realm.
Also, the entire fact that you try to store objects from another source indicates that you are mixing concerns that probably shouldn't be mixed.
I'm not saying this feature is impossible to do, but it is a lot more complex than you make it out to be, and to be honest we don't have the bandwidth to flesh it out right now. So while we appreciate your offer to help, I don't think this is an issue we can move forward with at this point in time.
@AndreasBackx I don't see how your example would work if I want to use getter/setter instead of direct field access.
Although I do admit, this is possible using Hibernate and co with

```java
@Converter(value = ConverterClass.class)
private NotSupportedField blah;
```

and then Realm would automatically know that the ConverterClass converts it to, for example, Date.
Essentially doing

```java
@Converter(external = NotSupportedField.class, realmType = Date.class, converter = ThisConverter.class)
private NotSupportedField blah;
```

And then Realm would create a schema type for Date, but the setter would call the converter's method automagically to map to the internal Realm type (Date), while the getter would call the converter's method automagically to map from the internal Realm type and return this converted NotSupportedField class instance directly.
@cmelchior I think this is what he wants.
However, would such a @Converter affect how queries are written against a converted field? Because then you'd need to do RealmQuery.equalTo(localTimeField, localTimeValue), which I don't see how Realm is supposed to support so easily.
@Zhuinden Yes, we have the same understanding then 😄
To enumerate the challenges that need to be tackled:
1) Getters/setters + public fields need to be supported, which requires modifying the RealmTransformer in non-trivial ways.
2) The lazy-loading elements of Realm completely go away for these 3rd-party types, which means it is non-trivial to reason about behaviour/performance. This might be fixable by bytecode manipulation, but will require some serious thought and black magic.
@cmelchior
I'm not sure what you mean by intermediate objects?
I mean creating RealmPoint objects in order to support Point objects.
It will never be possible to store arbitrary objects in Realm, precisely because we need to verify that all the fields in that object can be stored.
No, and I get that. This is why I'm suggesting to use converters, checks can still be run on converters, can't they?
This means, no matter what, you need to write the RealmPoint class by hand + some converter functions. It is not an intermediate object, it is the presentation in Realm. Converting to some basic type like String will e.g. make it impossible to query on fields, so should be avoided. Querying in general will be tricky and you would need to map between fields as well I think.
Simple querying should still be possible and should be based on what is stored by Realm. Let's say we want to store LocalTime. If we store it as a Date, then we could still query on less than and greater than. If we store it as a String, we'd have to use string comparisons instead and those wouldn't work correctly.
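To illustrate with today's API (a sketch only; Event and its _time backing field are hypothetical names, not classes from this thread): range queries work as long as the backing field is a type Realm understands.

```java
// The backing field is a java.util.Date, which Realm can store and query natively.
RealmResults<Event> results = realm.where(Event.class)
        .lessThan("_time", new Date())
        .findAll();

// If _time were stored as a formatted String instead, only string comparisons
// would be available, and lexicographic order only matches chronological order
// for carefully chosen formats (e.g. ISO-8601).
```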
Ideally I would like to only register the type adapter on the configuration since defining the same type adapter on multiple classes is annoying. However the information needs to be accessible from the annotation processor which probably means we need to use the @Converter annotation.
Having the option to define per-field converters is definitely an edge case. I don't immediately see the benefit it would bring compared to having one in Gson where it could be used to support a transitioning REST API. But I'm sure it would still be great to have the option.
What concerns me more is that this actually breaks our lazy loading, since we cannot return e.g. a LocalTime that is "live". It would have to be a copy of the data from Realm.
Could you explain this? Isn't it only an abstraction layer that is added in between? Maybe I'm missing something, but I don't see how it would affect lazy loading. The getter/setter solution seems to be almost that abstraction layer.
Also the entire fact that you try to store objects from another source indicate that you are mixing concerns that probably shouldn't be mixed.
Could you explain?
I'm not saying this feature is impossible to do, but it is a lot more complex than you make it out to be, and to be honest we don't have the bandwith to flesh it out right now. So while we appreciate your offer to help, I don't think this is an issue we can move forward with at this point in time.
I never said that it was gonna be easy. But what would make Realm a real game changer in my eyes is an almost seamless move from raw POJOs to data storage, and this is one of the ways you could do it. This issue could be started now and finished in a couple of months, but putting it off every single time isn't gonna get us closer to our goal.
@Zhuinden yes, that is what I would love for this library to have. Though I hope that second piece of code is representative of how it would be interpreted.
@cmelchior
Getters/setters + public fields need to be supported, which requires modifying the RealmTransformer in non-trivial ways.
I'm not familiar with the RealmTransformer, but from briefly looking at the source code I see that it implements the Android Gradle plugin Transform API. Why would this require any additional bytecode changes specifically for getters/setters?
The lazy-loading elements of Realm completely go away for these 3rd-party types, which means it is non-trivial to reason about behaviour/performance. This might be fixable by bytecode manipulation, but will require some serious thought and black magic.
Could you explain the lazy loading of elements and how adding converters would break it? Am I missing something, or aren't converters "simply" (yes, of course they're not simple to implement) a middle man, so to speak, or an abstraction layer?
It will never be possible to store arbitrary objects in Realm, precisely because we need to verify that all the fields in that object can be stored.
No, and I get that. This is why I'm suggesting to use converters, checks can still be run on converters, can't they?
The checks will automatically be applied if you create any Realm Model class. My point was just that you need a Realm model class somewhere and it cannot be autogenerated.
This means, no matter what, you need to write the RealmPoint class by hand + some converter functions. It is not an intermediate object, it is the presentation in Realm. Converting to some basic type like String will e.g. make it impossible to query on fields, so should be avoided. Querying in general will be tricky and you would need to map between fields as well I think.
Simple querying should still be possible and should be based on what is stored by Realm. Let's say we want to store LocalTime. If we store it as a Date, then we could still query on less than and greater than. If we store it as a String, we'd have to use string comparisons instead and those wouldn't work correctly.
In that case querying would only work if you understood the mapping between the underlying Realm type and your own. This would very much go against our current design where you can query any field name directly.
Ideally I would like to only register the type adapter on the configuration since defining the same type adapter on multiple classes is annoying. However the information needs to be accessible from the annotation processor which probably means we need to use the @Converter annotation.
Having the option to define per-field converters is definitely an edge case. I don't immediately see the benefit it would bring compared to having one in Gson where it could be used to support a transitioning REST API. But I'm sure it would still be great to have the option.
We can probably all agree having this on the configuration would be a lot better, but there are very strict limits to what an annotation processor can see. So putting the info on the configuration might make it impossible for the annotation processor to work correctly. GSON uses reflection which is why it works there, but it is also slow. Realm uses annotation processors / bytecode changes which are faster, but also comes with certain limitations.
What concerns me more is that this actually breaks our lazy loading, since we cannot return e.g. a LocalTime that is "live". It would have to be a copy of the data from Realm.
Could you explain this? Isn't there only an abstraction layer in between that is added and maybe I'm missing something, but I don't see how it would affect lazy loading. The getter/setter solution seems to be almost that abstraction layer.
All realm data is lazy, any object is just a thin wrapper around the underlying data stored in C++. Since non-Realm classes are not designed to work that way, getting the same semantics will be very hard. Having converted classes behave differently is also very undesirable due to the cognitive load put on users when they have to reason about what is doing what.
Also the entire fact that you try to store objects from another source indicate that you are mixing concerns that probably shouldn't be mixed.
Could you explain?
A persistence layer should be isolated from the rest of your app. That you want to store arbitrary objects is understandable, but it ultimately leads to a more tightly coupled architecture where it can be hard to change things. This is also the reason that some people advocate e.g. having different network, persistence, and model layers and view models. See Clean Architecture: https://8thlight.com/blog/uncle-bob/2012/08/13/the-clean-architecture.html
I'm not saying this feature is impossible to do, but it is a lot more complex than you make it out to be, and to be honest we don't have the bandwith to flesh it out right now. So while we appreciate your offer to help, I don't think this is an issue we can move forward with at this point in time.
I never said that it was gonna be easy. But in order to make Realm in my eyes a real game changer is to make it an almost seamless move from raw POJOs to data storage and this is one of the ways you could do it. This issue could be started now and finished in a couple of months, but putting it off every single time isn't gonna get us closer to our goal.
True, all I'm saying is that we don't have 2 weeks right now to flesh out all the corner cases of this approach. This feature will impact a lot of functionality in Realm, so a design phase will most definitely be needed.
The checks will automatically be applied if you create any Realm Model class. My point was just that you need a Realm model class somewhere and it cannot be autogenerated.
This could be changed so that it checks the resulting value of the converter?
In that case querying would only work if you understood the mapping between the underlying Realm type and your own. This would very much go against our current design where you can query any field name directly.
What are the other options available or that you have in mind?
We can probably all agree having this on the configuration would be a lot better, but there are very strict limits to what an annotation processor can see. So putting the info on the configuration might make it impossible for the annotation processor to work correctly. GSON uses reflection which is why it works there, but it is also slow. Realm uses annotation processors / bytecode changes which are faster, but also comes with certain limitations.
I don't really mind that much about this and you're free to do whatever here (well you're always free because you guys are the maintainers).
All realm data is lazy, any object is just a thin wrapper around the underlying data stored in C++. Since non-Realm classes are not designed to work that way, getting the same semantics will be very hard. Having converted classes behave differently is also very undesirable due to the cognitive load put on users when they have to reason about what is doing what.
But is the cognitive load any lower when having to make people implement their own intermediary RealmObject or getters/setters? I would even say that adding converters makes it a little bit more straightforward.
A persistence layer should be isolated from the rest of your app. That you want to store arbitrary objects is understandable, but it ultimately lead to a more tightly coupled architecture where it can be hard to change things. This is also the reason that some people advocate e.g having different network, persistence, model layer and view models. See Clean Architecture: https://8thlight.com/blog/uncle-bob/2012/08/13/the-clean-architecture.html
I agree that everything should be isolated, which is why I'm proposing converters in the first place. Having to work with an intermediary RealmObject, getValue(), and setValue() breaks the isolation.
Storing arbitrary objects as different objects already makes it so the persistence layer can be changed with relative ease. I would argue that these arbitrary objects belong to the persistence layer and don't see how falling back to Date objects would in any way make development easier, nor would it make for a less tightly coupled architecture.
True, all I'm saying is that we don't have 2 weeks right now to flesh out all the corner cases of this approach. This feature will impact a lot of functionality in Realm, so a design phase will most definitely be needed.
I get that the Realm team is focusing on the Realm Mobile Platform, as that seems to be the business model Realm is going for. However, I feel less and less convinced about using Realm for bigger projects, as it seems more like a gimmick because of all the workarounds that have no better solution. I chose Realm because I wanted to move away from the typical ORM and raw SQL, but these roadblocks keep popping up.
I would even argue that there should be support for some sort of datetime object with timezone support, because the currently supported Date object is deprecated and should be avoided. It's also a huge pain to work with.
@cmelchior
This is also the reason that some people advocate e.g having different network, persistence, model layer and view models. See Clean Architecture: https://8thlight.com/blog/uncle-bob/2012/08/13/the-clean-architecture.html
Creating ViewModels throws Realm's lazy-loading out the window.
Or at least the strict separation that Clean Architecture tells you. Mapping from RealmResults to ArrayList is bad.
@AndreasBackx
the currently supported Date object is deprecated and should be avoided.
https://docs.oracle.com/javase/8/docs/api/java/util/Date.html#Date-long-
Doesn't look deprecated to me
@Zhuinden We should look at the usability of the Date class. Also see the docs for JDK 7, as that doesn't include the compatibility with java.time, and that one has almost all of its methods deprecated. The Date class is a huge pain to deal with when you have to work with a lot of time data. It's an even bigger pain when you have to deal with timezones, formatting, parsing, and having to get months, dates or any other specific element from a point in time.
Prior to JDK 1.1, the class Date had two additional functions. It allowed the interpretation of dates as year, month, day, hour, minute, and second values. It also allowed the formatting and parsing of date strings. Unfortunately, the API for these functions was not amenable to internationalization. As of JDK 1.1, the Calendar class should be used to convert between dates and time fields and the DateFormat class should be used to format and parse date strings. The corresponding methods in Date are deprecated.
Even the docs recommend using the Calendar class instead of the Date class. But this is all besides the issue's point.
that one has almost all of its methods deprecated.
Typically things like setHour, getHour, etc., for which you can use SimpleDateFormat or the Calendar API.
Storing the date as Date and then converting it to LocalTime via a getter method is essentially the same as storing a long. Which isn't all that bad, considering that's typically how you store dates in a database.
Anyways, to add @Converter, the following would need to happen:
1.1) query("fieldname", <? extends RealmModel>
would need to be changed to query("fieldname", Object);
1.2.) query("fieldname", Object)
would check if Object is a RealmModel with instanceof
, and if it isn't, then check if there is a registered TypeAdapter for this given type; if it exists, then use TypeAdapter to convert the object value to Realm's type and use that, otherwise throw exception
2.) realm$get_() would need to return the new, funky type in order to support direct field access, which means that realm$get_() would need to call the converter to create the non-Realm-specific type from the internal Realm type
3.) realm$set_() would need to store the funky type in order to support direct field access, which means that realm$set_() would need to call the converter to map the value to the internal Realm type (see the sketch after this list)
NOTE: This could be relevant in defining a TypeAdapter for RealmList<String> (or just List<String>, in which case the question of generics support comes up)
4.) limitation: the problem with Calendar, for example, is that you can call things like set(Calendar.HOUR_OF_DAY, 23), which modifies the object internally; but in order to actually assign the object to the RealmObject rather than have it mismatched (if the arbitrary data type is not immutable), you must manually assign it with the this.field = blah operator to call realm$set, which also does the converter mapping
NOTE: This is the same limitation as modifying a byte[]'s internals.
5.) the Realm annotation processor should take the funky field variable type into account and generate the proxy accordingly: check if there is a @Converter annotation on it, and if yes, call the necessary converter methods in the generated realm$get() and realm$set() methods
6.) copyToRealmOrUpdate() and insertOrUpdate() would probably work once the above is done, because they rely on the generated setter/getter methods of the fields that map to native.
7.) Realm-Transformer doesn't need to be changed I think, because field access can still be changed to proxy method access just like before.
8.) Registration of TypeAdapters could happen by APT finding them and generating them if they don't exist yet. Or they can be registered via RealmConfiguration, but in other projects (such as LoganSquare) I don't think that is necessary.
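A rough sketch of what points 2, 3 and 5 could boil down to. This is not Realm's actual generated code; FooProxy, the realmGet$/realmSet$ naming, and the plain String backing field are all simplifications, and the @Converter/LocalTimeConverter API is still the hypothetical one from earlier in the thread:

```java
// Hand-written stand-in for what an annotation-processor-generated proxy could do
// for a field declared as "@Converter(LocalTimeConverter.class) public LocalTime time;".
// A real proxy would read/write a native Realm row instead of the storedTime field.
public class FooProxy extends Foo {

    private static final LocalTimeConverter CONVERTER = new LocalTimeConverter();

    // What Realm can actually store: the converter's "database" type.
    private String storedTime;

    public LocalTime realmGet$time() {
        // Point 2: convert from the stored Realm type on the way out.
        return CONVERTER.fromDatabase(storedTime);
    }

    public void realmSet$time(LocalTime value) {
        // Point 3: convert to the stored Realm type on the way in.
        storedTime = CONVERTER.toDatabase(value);
    }
}
```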
@cmelchior thoughts?
@Zhuinden thanks for summing it all up.
4.) limitation: the problem with Calendar for example is that you can call things like set(Calendar.HOUR_OF_DAY, 23) etc. which modifies the object internally, but in order to actually assign the object to the RealmObject rather than have it mismatched (if the arbitrary data type is not immutable), you must manually assign it with this.field = blah operator to call realm$set that also does converter mapping
This is a specific problem that should be brought to attention in the docs if it isn't already, but I'm not sure whether we can do anything about it. There is also a reason why java.time objects are all immutable.
8.) Registration of TypeAdapters could happen by APT finding them and generating them if they don't exist yet. Or they can be registered via RealmConfiguration, but in other projects (such as LoganSquare) I don't think that is necessary.
Registering a type adapter without its annotation (so in the configuration) is setting that converter as the default. You can afaik still override it with a field-specific type converter.
```java
LoganSquare.registerTypeConverter(LocalTime.class, new LocalTimeConverter());
```

This would make LocalTimeConverter the default for LocalTime.
```java
@JsonField(typeConverter = OtherLocalTimeConverter.class)
public LocalTime time;
```

That would override the default LocalTimeConverter afaik and use OtherLocalTimeConverter. We might be able to get some insight into how LoganSquare was designed by @EricKuck.
@cmelchior any thoughts?
@AndreasBackx I still think this is a good idea. It would solve https://github.com/realm/realm-java/issues/575 as well, even while that's not natively supported, by changing List<Integer> into a comma-separated String, for example.
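A minimal sketch of that idea, reusing the hypothetical RealmConverter<From, To> shape proposed earlier in this thread (neither RealmConverter nor @Converter exists in Realm; the class below is purely illustrative):

```java
// Hypothetical converter storing a List<Integer> as a comma-separated String column.
public class IntListConverter extends RealmConverter<List<Integer>, String> {

    public String toDatabase(List<Integer> value) {
        if (value == null) return null;
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < value.size(); i++) {
            if (i > 0) sb.append(',');
            sb.append(value.get(i));
        }
        return sb.toString();
    }

    public List<Integer> fromDatabase(String value) {
        List<Integer> result = new ArrayList<>();
        if (value == null || value.isEmpty()) return result;
        for (String part : value.split(",")) {
            result.add(Integer.parseInt(part));
        }
        return result;
    }
}
```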
I mean, after thinking about it above, this actually does seem like it'd solve a lot of things that you can do manually with getter/setter, but defining it globally would simplify those cases.
I do slightly wonder about schema validation afterwards; I'm not sure I listed that. It ought to be verified whether it takes type adapters into account.
@AndreasBackx Yes, something like that could probably make sense. There are a few design challenges that need to be tackled first though, especially around how and where to specify what the backing fields look like.
E.g. we need the annotation processor to ignore any field annotated with e.g. @Converter(OtherLocalTimeConverter.class) (that is easy); what is a lot harder is figuring out how exactly the OtherLocalTimeConverter should expose and work on the extra field information, because we need to be able to combine the fields from the Model class with the fields provided by converters.
Some random thoughts:
```java
public class Person extends RealmObject {
    @Converter(MyClassConverter.class)
    public MyClass field;
    // A. Should backing fields be directly exposed in the Main class or only in the Converter class?
}

public class MyClassConverter implements RealmFieldConverter<MyClass> {
    // B. Or only in Converter classes (which will increase the possibility of collisions between Converters)?

    // C. How to expose field information so it is available to the annotation processor?

    // 1. Use all fields in the class? Simple, but perhaps too restrictive.
    private String underlyingField1;
    private int underlyingField2;

    // 2. Another annotation for schema fields? Maybe too complex?
    @RealmField private String underlyingField1;
    @RealmField private int underlyingField2;
    private boolean unrelatedField;

    // 3. Other options?... Perhaps as class annotations instead of fields:
    @ConverterField("name", String.class)
    @ConverterField("age", int.class)
    public class PersonConverter extends RealmFieldConverter<Person> {
    }

    // D. What should the converter methods look like?
    // Probably need dynamic objects as input as they could be of any type.
    public MyClass read(DynamicRealmObject obj) {
        return new MyClass(obj.getString("name"), obj.getInt("age"));
    }

    public void write(MyClass val, DynamicRealmObject obj) {
        obj.setString("name", val.getName());
        obj.setInt("age", val.getAge());
    }
}
```
I think this problem is a superset of https://github.com/realm/realm-java/issues/776
And maybe GreenDao's @Convert could be a basis for what the API should look like.
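For reference, greenDAO 3's converter support (the @Convert annotation mentioned above) looks roughly like this; MyClass and its fromString/toString mapping are placeholders, and the exact signatures are quoted from memory, so treat this as an approximation:

```java
// greenDAO-style property converter: the entity field uses a custom type,
// while the database column uses a type the database understands.
public class MyClassConverter implements PropertyConverter<MyClass, String> {
    @Override
    public MyClass convertToEntityProperty(String databaseValue) {
        return databaseValue == null ? null : MyClass.fromString(databaseValue);
    }

    @Override
    public String convertToDatabaseValue(MyClass entityProperty) {
        return entityProperty == null ? null : entityProperty.toString();
    }
}

// Used on an entity field roughly like:
// @Convert(converter = MyClassConverter.class, columnType = String.class)
// MyClass field;
```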
@cmelchior I like the ideas you put forth in the example. However, splitting a class field into multiple database fields (that is how I interpret your example) might add a lot of complexity. I feel that it is better to use a single field of a type supported by Realm. As @Zhuinden says, GreenDao's solution is one that is widely adopted in other libraries. The idea is that a converter is just an intermediary between a custom object and a supported object: MyClass <-> String, for example.
The RealmConverter, or whatever the interface name will be, will then need 2 generics: the class it is going to convert from and the one it is going to convert to: RealmConverter<MyClass, String>. Wouldn't it also be better if the methods were static?
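To make that concrete, the base type being proposed might look something like this (RealmConverter is still hypothetical; the two type parameters are the custom field type and the Realm-supported storage type):

```java
// Hypothetical base class for the proposed converters.
// F = the field type declared in the model class, S = the type Realm actually stores.
public abstract class RealmConverter<F, S> {
    public abstract F fromDatabase(S storedValue);
    public abstract S toDatabase(F fieldValue);
}

// The LocalTimeConverter from earlier in the thread would then simply be
// "class LocalTimeConverter extends RealmConverter<LocalTime, String>".
```

As for static methods: static converter methods can't be enforced through a base type in Java, so the annotation processor would have to find them by naming convention; instance methods on a shared converter object are easier to verify at compile time.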
Well with Room out as an alternative, it's glaring that it has support for TypeConverters while Realm does not. And that's a problem, so this needs to happen sooner or later.
Another interesting fact is that this is how they handle primitive lists in the SQLite world: https://github.com/googlesamples/android-architecture-components/blob/master/GithubBrowserSample/app/src/main/java/com/android/example/github/db/GithubTypeConverters.java
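For comparison, Room's converters follow this pattern (the Date example below is the standard one from Room's documentation, not the exact contents of that linked file):

```java
// Room type converters are static methods annotated with @TypeConverter and
// registered on the database class via @TypeConverters(Converters.class).
public class Converters {

    @TypeConverter
    public static Date fromTimestamp(Long value) {
        return value == null ? null : new Date(value);
    }

    @TypeConverter
    public static Long dateToTimestamp(Date date) {
        return date == null ? null : date.getTime();
    }
}
```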
It's been a while. How's the current state of this? Really looking forward to TypeConverters like in Room, they're just so convenient.
Is this a work in progress? Is there a way for us to convert Date to DateTime from JodaTime?
@poldz123 you can store Date (or long or String or so) and the getter can return any type you want, at the moment.
@Zhuinden yes, but for how long have we been doing this? This issue is celebrating its 2nd birthday. I feel like I'm venting my frustrations with Realm here, but I think it's fair to say that Realm is not focusing on its user base. There are major pain points with Realm, the biggest one being that you cannot easily pass objects between threads. The threading issue, missing type adapters, and more make Realm feel out of date. It has started to feel cumbersome to work with Realm compared to the alternatives that have been popping up recently. My projects that use Realm have abominable POJOs because of Realm, and alternative solutions at least give me the chance to clean that up along with the threading mess.
@AndreasBackx
but for how long have we been doing this?
apparently since Dec 20, 2016
the biggest one is not being able to easily pass objects between threads.
I completely disagree; I think the biggest one is the lifecycle management, in which getInstance() is not idempotent.
Having thread-local proxy views is something you can easily figure out once you understand that it is a thread-local proxy view.
missing type adapters
This one truly is glaring because of how Room came out with it out-of-the-box, and if you look at its code, it isn't that tricky.
June of 2018 and still no support for type adapters?
End of 2018 now, any progress?
Have you considered a way to support custom data types? Maybe by registering a type converter? ORMs like GreenDao support this feature and it's very useful. We're currently using the @Ignore workaround but it's not clean.