This thread looks to be a little on the old side and therefore may no longer be relevant. Please see if there is a newer thread on the subject and ensure you're using the most recent build of any software if your question regards a particular product.
This thread has been locked and is no longer accepting new posts. If you have a question regarding this topic, please email us at support@mindscape.co.nz
Hello, I often use CLR types for fields, such as Uri, IPAddress, SecureString and many others. Would it be possible to define these mappings once, in, say, the LightSpeedContext? That way I could use Uri wherever I want and rest assured that it would be converted to a string as needed. This is what I had in mind:
If my entity has a field of type Uri, then mapping automatically happens, even though I never specified that on the field itself.
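Something like this, say (RegisterConverter and the delegate pair are entirely hypothetical, invented here to illustrate the idea, not an existing LightSpeed API):

```csharp
using System;
using Mindscape.LightSpeed;

// Hypothetical API: RegisterConverter does not exist in LightSpeed today.
// The idea is a one-time, context-wide mapping between a CLR type and a
// database-friendly type.
var context = new LightSpeedContext<ModelUnitOfWork>("Development");
context.RegisterConverter<Uri, string>(
    toDatabase:   uri => uri.AbsoluteUri,
    fromDatabase: s   => new Uri(s));

// After that, any entity field of type Uri would round-trip as a string
// with no converter attribute on the field itself:
public class Article : Entity<int>
{
    private Uri _homepage;   // stored in a varchar column automatically
}
```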
Hopefully this would also allow me to map types supported by LightSpeed, such as string arrays, that aren't supported by my database engine (in my case, MySQL). Thanks, Werner
+1 on the feature, but I'd like to see it from the other direction: add more types handled directly by LightSpeed, with no field converters or attributes necessary at all. It's a slippery slope, but Uri, IPAddress, Xml and SecureString would seem to make the cut. Is there really an app these days that doesn't use at least one of those? On the other hand, database-specific types (Microsoft.SqlServer.Types.*) might not. And is there a reason enums have to be manually tagged and converted? If the database column is a string and the entity field is an enum... Well, I guess this is more of a designer feature. I mean, it's trivial...
but it seems a little inelegant. The .NET community in general seems to be moving away from attribute soup toward more of a POCO model (we'll see how that works out once they manage to add all the features back in). This may be inspired by the current crop of coders who spend more time in their blog editor than their code editor. But simpler is obviously better, except when it's too simple. I like the designer (it really is the best one out there), but I understand how people who want to build classes up manually might be feeling more pain than necessary. There's a lot of legacy in there.
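To illustrate the enum point: today the conversion is hand-written, roughly like this (the base class and method names below are approximate, from memory; check the LightSpeed FieldConverter documentation for the exact signatures):

```csharp
using System;

public enum ArticleStatus { Draft, Published, Archived }

// Approximate shape of a manual converter; the exact base class and
// overrides differ in the real FieldConverter API.
public class ArticleStatusConverter // : FieldConverter<...>
{
    public object ConvertToDatabase(ArticleStatus value) => value.ToString();

    public ArticleStatus ConvertFromDatabase(object value) =>
        (ArticleStatus)Enum.Parse(typeof(ArticleStatus), (string)value);
}
```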
Transparently handling streams with different backend storage mechanisms can be beneficial too. For example, the entity may look as follows:
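Something along these lines (a sketch only; the [StoredStream] attribute and the idea of exposing the blob directly as a Stream are invented for illustration):

```csharp
using System.IO;
using Mindscape.LightSpeed;

public class Document : Entity<int>
{
    private string _name;

    // Invented attribute: the backing store (blob column, Amazon S3, Azure
    // Blob Storage, local disk) would be chosen in configuration, not here.
    [StoredStream]
    private Stream _content;
}
```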
The stream could then be configured to be stored in a blob field, Amazon S3, Azure Blob Storage or the local file system. LightSpeed might require additional metadata columns in the table to determine, say, the bucket and path of a stream on Amazon S3. LightSpeed could support streaming natively for those databases that offer it, and emulate it for those that don't. In addition, encryption is a critical component when dealing with streams. Currently it is a complex task to manage streams along with the UnitOfWork life cycle. Say, for example, one has a 500MB blob stored in the database. Keeping a byte array of it around eats a large amount of resources. It may be better to dump it to a temporary file and return multiple stream objects, including an intermediate stream to perform decryption. Then at the end of the request, the streams are disposed and the files are deleted. One only has to look at the Ruby on Rails Paperclip plugin for inspiration. Thanks, Werner
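The temporary-file idea can be sketched with plain .NET streams (this is only an illustration of the lifecycle, not LightSpeed code; the key trick is FileOptions.DeleteOnClose, which removes the temp file once the stream chain is disposed):

```csharp
using System.IO;
using System.Security.Cryptography;

static Stream OpenDecrypted(byte[] encryptedBlob, Aes aes)
{
    // Spill the blob to a temp file so we don't pin 500MB in memory.
    string path = Path.GetTempFileName();
    File.WriteAllBytes(path, encryptedBlob);

    // DeleteOnClose ties the file's lifetime to the stream's: disposing the
    // CryptoStream disposes the inner FileStream, which deletes the file.
    var file = new FileStream(path, FileMode.Open, FileAccess.Read,
                              FileShare.None, 4096, FileOptions.DeleteOnClose);
    return new CryptoStream(file, aes.CreateDecryptor(), CryptoStreamMode.Read);
}
```

Disposing the returned stream at the end of the request then cleans up the decryptor, the file handle and the temp file in one go.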
I do something similar. I made a custom type, store the connection info in the field, and have an .AsStream() method on the type. That's probably the right level of abstraction, no? Regardless, it makes me wish there were some sort of open source / extension community around LightSpeed, as it seems a LightSpeed.Contributions namespace might fill up quickly with some very handy stuff. I punted on lifecycle issues [he who calls .AsStream() must .Close() it]. The issue for me was how to handle updates: when I modify the file in the S3 bucket, I want .UpdatedOn to match. I added a "revision" field and incremented it, since .UpdatedOn is read-only and I couldn't figure out how to force an update of just .UpdatedOn otherwise.
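The shape of the custom type, roughly (all names here are mine, not from any library, and the caller is responsible for closing the returned stream):

```csharp
using System;
using System.IO;

public class S3Pointer
{
    // Connection info persisted in the entity's string field.
    public string Bucket { get; set; }
    public string Key { get; set; }

    // Bumped on every content change so the entity becomes dirty and
    // UpdatedOn gets refreshed, since UpdatedOn itself is read-only.
    public int Revision { get; set; }

    // He who calls AsStream() must Close() it. The opener delegate stands
    // in for whatever S3 client is in use.
    public Stream AsStream(Func<string, string, Stream> open) =>
        open(Bucket, Key);
}
```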