|
So, you have data coming in from 200 devices, 5 times a second. Let's think about the problem areas here:
- When you add data, at that volume, you're adding 1000 items a second. Not a problem in SQL Server.
- I'm assuming it's one aggregation per device every minute, so you're looking at a read of 60K rows every minute, with a write of 200 rows.
I would be tempted to invert the problem slightly. Rather than write to one source, write to two. So, write the incoming messages to SQL Server and, at the same time, write them to something like Redis Cache. Then, every minute, get your data from Redis and use that to calculate the values for the other table and write to that. This approach follows (somewhat) the CQRS pattern, so you're separating the reads and writes from each other.
The beauty is, this scales well.
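A minimal sketch of that dual-write idea, assuming StackExchange.Redis and Microsoft.Data.SqlClient; the table names (DeviceReadings, DeviceAggregates), the per-device Redis key, and the aggregate columns are illustrative only, not anything taken from the post above:

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;
using StackExchange.Redis;

public class ReadingPipeline
{
    private readonly string _sqlConnectionString;
    private readonly IDatabase _redis;

    public ReadingPipeline(string sqlConnectionString, IConnectionMultiplexer redis)
    {
        _sqlConnectionString = sqlConnectionString;
        _redis = redis.GetDatabase();
    }

    // Write path: persist the raw reading to SQL Server and push a copy
    // onto a per-device Redis list that only the aggregation job reads.
    public async Task WriteReadingAsync(int deviceId, double value, DateTime timestampUtc)
    {
        using var conn = new SqlConnection(_sqlConnectionString);
        await conn.OpenAsync();
        using var cmd = new SqlCommand(
            "INSERT INTO DeviceReadings (DeviceId, Value, TimestampUtc) VALUES (@d, @v, @t)", conn);
        cmd.Parameters.AddWithValue("@d", deviceId);
        cmd.Parameters.AddWithValue("@v", value);
        cmd.Parameters.AddWithValue("@t", timestampUtc);
        await cmd.ExecuteNonQueryAsync();

        await _redis.ListRightPushAsync($"readings:{deviceId}", value);
    }

    // Read/aggregate path: called once a minute per device (e.g. from a
    // timer or hosted service), so the raw rows never have to be read
    // back out of SQL Server.
    public async Task AggregateDeviceAsync(int deviceId)
    {
        string key = $"readings:{deviceId}";
        RedisValue[] raw = await _redis.ListRangeAsync(key);
        if (raw.Length == 0) return;

        // Simplification: readings pushed between the read and the delete
        // are lost; a real job would pop atomically or rename the key first.
        await _redis.KeyDeleteAsync(key);

        double[] values = raw.Select(v => (double)v).ToArray();

        using var conn = new SqlConnection(_sqlConnectionString);
        await conn.OpenAsync();
        using var cmd = new SqlCommand(
            "INSERT INTO DeviceAggregates (DeviceId, PeriodEndUtc, AvgValue, MinValue, MaxValue, ReadingCount) " +
            "VALUES (@d, @p, @avg, @min, @max, @c)", conn);
        cmd.Parameters.AddWithValue("@d", deviceId);
        cmd.Parameters.AddWithValue("@p", DateTime.UtcNow);
        cmd.Parameters.AddWithValue("@avg", values.Average());
        cmd.Parameters.AddWithValue("@min", values.Min());
        cmd.Parameters.AddWithValue("@max", values.Max());
        cmd.Parameters.AddWithValue("@c", values.Length);
        await cmd.ExecuteNonQueryAsync();
    }
}

The nice part is that the once-a-minute job only ever touches Redis and then writes 200 small rows, so the raw-readings table in SQL Server stays append-only as far as the hot path is concerned.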
|
|
|
|
|
That was actually a brilliant idea, as I already have Redis running.
The number of devices will hopefully increase 10- to 100-fold, but this will still scale.
It will also scale with additional transformations in the future.
Thanks!
The only drawback with this solution is that I have no excuse to sit down and play with data lakes
and all their tools...
|
|
|
|
|
What worries you most? Is it the data volume? A back-of-the-napkin calculation shows something like 10-20K per second. Almost any DB system could handle that.
Mircea
|
|
|
|
|
Volume, yes.
As of today, the volume is not a problem at all.
But when we get thousands of devices online, I'm a little bit afraid that the
job generating the aggregates will struggle or lead to locks.
Also, who knows what other post-processing needs will occur in the future.
|
|
|
|
|
How many more devices do you anticipate connecting: 20,000? 200,000? At the 20,000 level, I think you should still be safe. At 200,000, I'm not so sure.
Keep in mind, however, that inserting is the more probable bottleneck, as it has to update all those indexes, allocate pages, etc. Querying should be much faster.
Mircea
|
|
|
|
|
Thanks!
|
|
|
|
|
In one of the sub-threads you said the following:
"The number of devices will hopefully increase 10 to a 100-fold"
You should have a very solid idea of what this increase looks like before you start making decisions. You should know the following:
1. What is the timeline for the increase? Also, will the increase be gradual or abrupt?
2. What is the maximum? If you owned the market, what size would you need to support?
Typically, when I size, I do the following:
1. Document my assumptions.
2. Document my sources - where I got the assumptions and how I derived them.
3. Increase the size from that by a factor of 3 to 10.
Don't attempt to create a system that will work for the next 100 years. One that works for 10 is fine.
After you have those numbers, you start looking for solutions that will handle them. Keep in mind, of course, that your architecture/design should support the larger values, but the implementation does not need to support them now. You just want to make sure the implementation does not preclude sizing it up.
|
|
|
|
|
Thanks!
Very good points.
The question was more about what a data lake would offer in terms of post-processing and other benefits, since I am not familiar with the technology.
I have no problems implementing an efficient solution with Azure/SQL/C# for the foreseeable future.
Just needed some insights into a tech I'm not familiar with.
|
|
|
|
|
I am building an application in which one of the parts is calendar updates and push notifications.
I am wondering what approach, architectural pattern, and technologies are best suited for this scenario. I am a .NET developer; if possible, kindly guide me on the same platform.
|
|
|
|
|
|
erummirza wrote: push notification
Pushing requires that the destination is available and capable of accepting the request. That seems unlikely for "calendar" in general, since it would suggest client machines (which can be off or have no network access).
So you are going to need to refine your requirements more before you can do anything.
If the requirements are very specific, such as using an MS Exchange server, then there still is not enough detail for an "architecture". But in that case, you would start with how you are going to get the updates in the first place.
|
|
|
|
|
I have a framework that I've been developing in .NET 4.5.2 over the years. It's all stored in individual repos on GitHub.
There are 25 individual projects. Now I want to move the target framework up to 4.7.2, but when I try to update the versions, it breaks some apps.
So, what's the right way to set this all up in GitHub? Would you create a branch for each version?
"In theory, theory and practice are the same. But in practice, they never are."
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
|
|
|
|
|
If you update the target framework for an assembly, then you need to update the target for anything that references that assembly. That should only be a problem if you're not updating the target for everything, which doesn't sound like the case here.
If you start from the bottom up, then you may get errors when you first open a project that targets 4.5.x and references an assembly you've just updated to 4.7.x; but those should go away once you change that project's target framework.
If that bothers you, then it may be easiest to work from the top down; start with the applications, then the assemblies they reference, then any transitive references, and so on. Once that's done, you can work your way back up, updating the references at each level.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
The problem is that I still have apps that target my 4.5 assemblies, so I think I need to create copies (new versions) of my assemblies.
I was considering creating a new branch under MyAssembly called 4.5, then another called 4.7, etc.
"In theory, theory and practice are the same. But in practice, they never are."
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
|
|
|
|
|
If the apps target 4.5, then the assemblies they use need to target 4.5.
Creating separate branches might work; but there's a good chance the branches would drift further and further apart over time, making it harder to apply bug fixes to them all.
I'd be more inclined to use multi-targeting[^] - have one branch, with one project, building for multiple framework versions. Where needed, you can use conditional compilation to add features to the 4.7 version that won't be included in the 4.5 version.
How to multitarget | Develop libraries with the .NET CLI - .NET | Microsoft Learn[^]
If you're updating the apps to at least 4.7.2, then it would probably be a good idea to have the libraries target .NET Standard 2.0[^], the last "standard" version supported by .NET Framework.
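For what it's worth, here is a minimal sketch of the conditional-compilation part, assuming an SDK-style project with <TargetFrameworks>net452;net472</TargetFrameworks> (which defines the NET452/NET472 preprocessor symbols automatically); the helper below is purely illustrative:

public static class BufferHelper
{
    public static byte[] EmptyBuffer()
    {
#if NET472
        // Array.Empty<T>() only exists from .NET Framework 4.6 onwards,
        // so this branch is compiled into the net472 build only.
        return System.Array.Empty<byte>();
#else
        // The net452 build falls back to allocating a zero-length array.
        return new byte[0];
#endif
    }
}

One branch, one project, and the build produces a separate assembly per target framework.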
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
I'm part of a members-only club, and I'm working on an app for them. We have coded membership cards.
They have a keypad to open the door, and they want to implement an electronic card-reader door lock. We want to build it into the app so they can lock people out as needed, say when their membership expires.
This would be like a hotel door lock, except I only need one lock, and, most importantly, I need to be able to build it into my app, so I need a solution that comes with an API I can tap into.
Anyone know of anything like this?
"In theory, theory and practice are the same. But in practice, they never are."
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
|
|
|
|
|
Sounds like a job for NFC. Many phones these days do NFC, and for the members that don't have one, there's a "credit card" option.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Can you elaborate on this a little more? I looked up 'NFC', but the results are a bit vague.
"In theory, theory and practice are the same. But in practice, they never are."
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
|
|
|
|
|
Near Field Communication - The range is only a couple of inches.
Someone's phone can be used as a "key" to unlock the door.
|
|
|
|
|
|
Peter_in_2780 wrote: Also it might be worth chatting to @glennPattonWork . He's known to dabble in such things.
How do I reach out to Glenn? I click this link and it takes me to his profile, but I don't see any way to message him.
"In theory, theory and practice are the same. But in practice, they never are."
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
|
|
|
|
|
If you post something in a forum here with an @username in the body of the message, they should receive a notification that they have been mentioned.
See what happens, @kevinmarois !
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Poor old username[^] - a lurking member for nearly 21 years, and they get an email every time someone wants to explain how the "mentions" system works.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Never fear, Glenn is here to help you!
Dabble is correct; I am trying to convince people they are actually using NFC while they claim to be using RFID.
NFC, as others have said, is a short-range protocol for reading and writing data to a tag. I think there are limits imposed on the range you can operate at; I'm fairly sure these are set by the power applied to the antenna. I'm more of an expert on RFID, though. If I can help, I will...
|
|
|
|
|
I am remodeling the restrooms in a business building that has already been built. Do I need to use the most recent version of ICC A117.1, or can I use the ICC A117.1-2009 standards?
|
|
|
|