This thread looks to be a little on the old side and therefore may no longer be relevant. Please see if there is a newer thread on the subject and ensure you're using the most recent build of any software if your question regards a particular product.
This thread has been locked and is no longer accepting new posts. If you have a question regarding this topic, please email us at support@mindscape.co.nz
|
Do you have any plans to support "SQL Azure Elastic Scale"?
More info: do you have any ideas how I can use their API with your LightSpeed connections? |
|
No, we have no current plans to support this as it is more of an application-specific concern. I don't have any personal experience with it, so I'm afraid I have no suggestions on its use. You are welcome to pop a feature request on the request forum though: http://www.mindscapehq.com/thinktank/product/9 - it would be useful if you could elaborate on how you would be looking to see this integrated as well.
|
Hi Jeremy, I've used your product for over 5 years, I love it, and your support has always been great. I know you've said that you have no plans yet to support [Azure SQL Elastic Scale], but can you give me some guidance on how to implement it with LightSpeed? Here is an example with Entity Framework: https://code.msdn.microsoft.com/Elastic-Scale-with-Azure-bae904ba?SRC=VSIDE. Basically, they subclass the DataContext to support the shard map. I think it would be pretty easy to do something similar with the connection object for LightSpeed; I just need some guidance on where best to do this, relative to the constructor of LightSpeed or UnitOfWork. Since LightSpeed already supports Azure, I think [Elastic Scale] is going to be a big boost for both Azure and LightSpeed. Thanks much, Alex. |
|
Hi Alex, If there is some way of asking the ShardMapManager for the raw connection string for a given shard, then you can use that to instantiate a new LightSpeedContext with the associated connection string. That would seem to be the logical integration point.
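For what it's worth, a minimal sketch of that integration point might look like the following. This assumes the Azure Elastic Scale client library (Microsoft.Azure.SqlDatabase.ElasticScale.Client) and LightSpeed are both referenced; the shard map name "CustomerShardMap", the `MyModelUnitOfWork` type, and the `ForCustomer` helper are all hypothetical names for illustration, and the exact connection-string format and `DataProvider` value would need checking against your environment.

```csharp
using Microsoft.Azure.SqlDatabase.ElasticScale.ShardManagement;
using Mindscape.LightSpeed;

public static class ShardedContextFactory
{
    // Resolve the shard that owns a given sharding key, then build a
    // LightSpeedContext pointed at that shard's database.
    public static LightSpeedContext<MyModelUnitOfWork> ForCustomer(
        ShardMapManager manager, int customerId, string credentials)
    {
        // "CustomerShardMap" is an assumed shard map name.
        ListShardMap<int> shardMap = manager.GetListShardMap<int>("CustomerShardMap");

        // Find the shard mapped to this customer's key.
        Shard shard = shardMap.GetMappingForKey(customerId).Shard;

        // Build a raw connection string from the shard's location
        // (format is an assumption; adjust for your setup).
        string connectionString =
            $"Server={shard.Location.DataSource};Database={shard.Location.Database};{credentials}";

        return new LightSpeedContext<MyModelUnitOfWork>
        {
            ConnectionString = connectionString,
            DataProvider = DataProvider.SqlServer2012
        };
    }
}
```

From there you would call `CreateUnitOfWork()` on the returned context as usual; the key design point is that the context is created per shard, after the shard map lookup, rather than once for the whole application.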