Our article goes into a lot of detail on this topic. Making the wrong choice can kill any chance of platform success. Whatever solution architecture you choose, there will need to be some leeway and the ability to either buy your way out of unforeseen limitations or develop your way out of them. It could be that the platform you chose costs more to develop than others, or that the solution architecture you chose will never meet your needs in its current form. All website hosting providers manage their hosting in ways that will be alien to how you envisaged it working. For this reason we provide a very transparent discussion of this. The main piece of advice to take away is to work with honest technology providers who have been there, seen it, and done it - they expect some challenges. You don't want technology providers who pretend that everything is perfect and that they have all the answers. Go with providers that have experience implementing solutions across on-premise, cloud architecture, virtual machines, and managed hosting.
You can find our website host through our affiliate link: Cheap ASP.NET Core Hosting.
We build powerful data-centric solutions that are web-based, cloud-based, and application-based. Much of our time has been spent inside enterprises, and significant time has been spent outside of enterprises building solutions that deliver maximum value in a cost-optimised manner.
When reading this article we strongly advise you not to go down the path of trying to outsource this cheaply overseas, or to try to do it yourself by buying one of the off-the-shelf CMS solutions. Neither is going to a recruitment agency to find a developer to do everything you ask a good option, because they won't operate in an objective manner. They are unlikely to have the depth of experience to implement what you require.
So feel free to contact Info Rhino.
Unless your organisation is a large enterprise with millions to spend on the best cloud hosting architecture, teams of developers, and licensing, you will face multiple challenges:
We took a look at many articles on building scalable website data platforms and were surprised to see some very strange approaches to building website solutions. Whilst many of these points may sound fair in many respects, mostly they are trite. Planning your solution architecture is vital, but it must be done in a scalable, continuous-improvement manner.
The idea here is that you can maintain and edit content without too much overhead when maintaining your website. We are strong proponents of website content management systems. However, a CMS can suffer major performance issues due to constraints in the form of database capacity (server and CPU), and limitations in how the CMS solution was developed.
Any solution that involves more calculation and retrieval of information into a native format will incur higher resource costs. Accessing a database and transforming that information to deliver it as HTML is more resource-intensive than simply returning a static HTML page from a web server. In many situations a content management system is sacrificing performance for convenience - always bear this in mind.
We think that making components and plugins available to the website, either through the cache or from separate headless CMS solutions, is a better approach. We have already started adding cache-based solutions on top of our content management system to help us prepare for higher footfall on our website.
Optimising database queries is a no-brainer. If your database has hundreds of tenants or poorly maintained tables with high fragmentation, then your website will perform terribly. The elephant in the room: we want to reduce as many hits to the database as possible. This article will cover our latest implementation for our cryptocurrency platform later on, but to give a short introduction, we have significant database usage and processing, yet we only run these queries every minute or so. This data is then distributed to separate locations that the website can retrieve from without continually hammering the database. Consequently, we need not care so much about database performance, because we are only hitting the database on a low-volume basis. We can run more resource-intensive queries without being concerned about database contention from high numbers of concurrent users.
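The pattern above can be sketched in a few lines. This is an illustrative TypeScript example, not our production code: the expensive query runs on a schedule, its results land in a snapshot, and page requests read the snapshot rather than the database. The `runExpensiveQuery` function is a stand-in for real database access.

```typescript
// Stand-in shape for a row returned by the expensive query (illustrative).
type Row = { symbol: string; price: number };

// Hypothetical placeholder for the heavy database query.
function runExpensiveQuery(): Row[] {
  return [
    { symbol: "BTC", price: 43000 },
    { symbol: "ETH", price: 2300 },
  ];
}

// The snapshot the website reads from; refreshed every minute or so.
let snapshot: { rows: Row[]; refreshedAt: number } = { rows: [], refreshedAt: 0 };

function refreshSnapshot(now: number = Date.now()): void {
  // The only place the database is touched, on a low-volume cadence.
  snapshot = { rows: runExpensiveQuery(), refreshedAt: now };
}

// Page requests read the snapshot; no per-request database hit.
function getPrices(): Row[] {
  return snapshot.rows;
}

// A scheduler would drive the refresh, e.g.:
// setInterval(refreshSnapshot, 60_000);
refreshSnapshot();
```

Because the refresh runs at most once a minute, the underlying query can be as resource-intensive as it needs to be without creating contention for concurrent website users.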
Delivering information from an API is a classic way to lighten the load; indeed, it makes sense if we think about the payload being delivered. A website delivering HTML pages will have a much higher payload than JSON datasets in most circumstances. Additionally, if our website is using client-side code to retrieve content and information from an API, we are (possibly) less likely to be using as much server-based resource.
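A rough illustration of the payload point: the same three records rendered as an HTML table versus serialised as JSON. The exact numbers will vary by page, but the markup wrapper typically dominates the HTML size.

```typescript
// The same data, delivered two ways (records are illustrative).
const records = [
  { name: "Report A", views: 120 },
  { name: "Report B", views: 85 },
  { name: "Report C", views: 42 },
];

// API response: the raw dataset as JSON.
const asJson = JSON.stringify(records);

// Server-rendered response: the same dataset wrapped in table markup.
const asHtml =
  '<table class="report-list"><thead><tr><th>Name</th><th>Views</th></tr></thead><tbody>' +
  records
    .map(r => `<tr><td class="name">${r.name}</td><td class="views">${r.views}</td></tr>`)
    .join("") +
  "</tbody></table>";

// Comparing byte counts makes the payload difference concrete.
console.log("JSON bytes:", asJson.length, "HTML bytes:", asHtml.length);
```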
An API makes perfect sense, but there are some significant drawbacks in exposing an API from the same application as the website:
APIs are a vital technology for lightening the payload of websites, but they will need to be hosted on separate architecture as the solution grows. For this reason, try to build the API in a relatively clear, structured manner so that you can decouple it from the website when required.
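One way to keep the API decouplable, sketched in TypeScript (the interface and class names are illustrative, not a prescribed pattern): the website depends only on an interface, and the wiring decides whether that interface is an in-process class today or a client for a remote API host tomorrow.

```typescript
// The contract the website codes against.
interface ReportService {
  getReport(id: string): { id: string; title: string } | undefined;
}

// Today: an in-process implementation living inside the website application.
class LocalReportService implements ReportService {
  private reports = new Map([["r1", { id: "r1", title: "Daily prices" }]]);
  getReport(id: string) {
    return this.reports.get(id);
  }
}

// Tomorrow: the same contract, backed by an API on separate architecture.
class RemoteReportService implements ReportService {
  getReport(id: string): { id: string; title: string } | undefined {
    // A real implementation would call something like
    // fetch("https://api.example.com/reports/" + id) - URL is illustrative.
    throw new Error("not wired up in this sketch");
  }
}

// Page logic never knows which implementation it was given.
function renderReportTitle(svc: ReportService, id: string): string {
  return svc.getReport(id)?.title ?? "not found";
}
```

Swapping `LocalReportService` for `RemoteReportService` at the composition root is then the whole migration, as far as page code is concerned.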
This is both an incredibly sound piece of advice and a very dangerous recommendation. We don't want to go into this point in too much depth, but we must remember that all websites have a configured maximum allocation of memory and CPU. Often, web hosts provide much more disk space than they do memory and CPU. Many website technologies offer a certain amount of caching as part of the solution architecture.
The reason for utilising caching is that information is stored in memory without needing to go to disk or a separate database; we avoid lock contention on the database server too, and we should reduce latency because the information is to hand. A major issue with caching is that the actual amount of memory being used is not straightforward to determine, and as we start to get more users on our website we want every ounce of memory that is available. .NET Core offers distributed caching but, surprise, surprise, backing these solutions are databases/data stores. Many technologies make it quite challenging to maintain these caches, and the caches may themselves have latency issues. It should also be remembered that most enterprise-level relational databases cache data themselves to avoid server overhead, and may do so better than caching solutions within the website architecture.
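The memory caveat above can be made concrete with a minimal sketch (illustrative, not a real caching library): a TTL cache that also tracks an approximate memory budget and evicts when over it, since on constrained hosting the real cost of cached entries is easy to lose sight of.

```typescript
// A toy TTL cache with a crude memory budget (string length stands in
// for real payload sizing - an assumption for illustration only).
class TtlCache {
  private store = new Map<string, { value: string; expires: number }>();
  constructor(private ttlMs: number, private maxBytes: number) {}

  private usedBytes(): number {
    let total = 0;
    for (const e of this.store.values()) total += e.value.length;
    return total;
  }

  set(key: string, value: string, now: number = Date.now()): void {
    // Evict oldest entries when over budget rather than growing unbounded.
    while (this.usedBytes() + value.length > this.maxBytes && this.store.size > 0) {
      const oldest = this.store.keys().next().value as string;
      this.store.delete(oldest);
    }
    this.store.set(key, { value, expires: now + this.ttlMs });
  }

  get(key: string, now: number = Date.now()): string | undefined {
    const e = this.store.get(key);
    if (!e) return undefined;
    if (now > e.expires) {
      // Stale entries are removed on read, returning memory to the budget.
      this.store.delete(key);
      return undefined;
    }
    return e.value;
  }
}
```

Even this toy version shows why capacity planning matters: the eviction policy, not just the TTL, decides how much of the host's limited memory the cache will actually claim.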
Despite our caveats on website caching we do make substantial use of it because it fits with our scaling out approach.
By scaling out we lower the cost of more high-end software. For example, we could separate the processing of information into smaller virtual machines or containers and reduce contention between these solutions. In monolithic server architectures we tend to find ourselves adding more server capacity to allow for higher processing loads. We agree that scaling out is preferable, but when starting to build a solution architecture it can be overly complicated to implement a properly distributed microservice-based approach. An example: if we have identical data loads for different entities, it would make sense to build a single modular process and drop in different configuration and mappings. The challenge with this approach is that we can find ourselves trying to perfect the unknown. We just don't know how this will work in the short term, and only once we have been testing our solution for a good while can we start to understand how we might build a micro-processing approach.
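The "one modular process, different configuration" idea above can be sketched as follows. This is an illustrative example, not our actual loader: a single process takes a mapping config describing how source fields map onto a target shape, so a new entity means a new config, not a new program.

```typescript
// Config-driven field mapping (all field and entity names are illustrative).
type Mapping = { sourceField: string; targetField: string };
type EntityConfig = { entity: string; mappings: Mapping[] };

// One generic process handles any entity the config describes.
function processRecords(
  config: EntityConfig,
  records: Record<string, unknown>[],
): Record<string, unknown>[] {
  return records.map(rec => {
    const out: Record<string, unknown> = { _entity: config.entity };
    for (const m of config.mappings) out[m.targetField] = rec[m.sourceField];
    return out;
  });
}

// Dropping in a different config retargets the same process.
const coinConfig: EntityConfig = {
  entity: "coin",
  mappings: [
    { sourceField: "sym", targetField: "symbol" },
    { sourceField: "px", targetField: "price" },
  ],
};

const loaded = processRecords(coinConfig, [{ sym: "BTC", px: 43000 }]);
```

The trade-off the paragraph warns about applies here too: the mapping schema is a guess about future entities, and only real usage tells you whether it was the right abstraction.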
We often find ourselves adding extra horizontal capacity because of the prohibitive costs of scaling up. There is one thing some solution architects may overlook: if we take our processing off the cloud, we can have quite high capacity running relatively cheaply. Reducing your organisation's burn rate is essential.
"Don't let perfection be the enemy of the good." A desktop can run up to 32GB of RAM with multiple CPU cores, and we could have one or more virtual machines hosted in the cloud to take over processing if a local machine goes down.
Sadly, most web platforms that are starting out will not support more than 50 to 100 users. Less scrupulous web hosts don't make it clear what their hosting can actually deliver in terms of CPU; they may provide detail on memory, but CPU is critical with website applications because they're constantly running calculations. We picked one of the top CMS systems and searched Google for hosting plans, finding some ridiculously cheap prices. We couldn't find any mention of memory capacity or CPU capacity on the server. Some provide metrics in terms of page load time, but this seems relatively meaningless without knowing how many users were accessing the website at the time.
We take it as a given that there will always be something not quite clear when deciding to commit to third-party hosting. It is the same with the top cloud providers: the moment you start actually working with the technology, there will be things you didn't quite foresee. It is the same with buying proprietary software. You are trying to find a way to have some confidence that your platform can survive for the foreseeable future.
In August 2023 we released our website data platform solution for cryptocurrency-based reporting. We currently have multiple websites running with the same hosting provider that we have used for many years. We think they are excellent, but there are some nuances that can really throw you if you are not experienced in building website data platforms.
It is worth highlighting both the benefits and drawbacks of using a web host (common to most, if not all):
The above is a function of a host needing to provide you with the support and reliability you would come to expect - that is part of the challenge. Whichever host you use, there will be certain oversights you may not have foreseen.
For this reason, the single most powerful way to keep your website up is to take whatever processing you can off the website. This makes sense in terms of:
As an aside, having a completely separate server that processes and stores transactional details, but only posts the confirmation granting a website user access to a resource, is far better than storing all of the transactional detail on the web hosting environment's server. This aligns with promise theory and operates on a need-to-know basis.
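That need-to-know split can be sketched as follows (an illustration, not our transaction system): the full record stays on the separate server, and only a minimal confirmation crosses the boundary to the web tier.

```typescript
// Lives on the separate transaction server, never on the web host
// (shape and field names are illustrative).
const transactionStore = new Map<
  string,
  { userId: string; amount: number; card: string }
>();

// All the web tier is allowed to hold: an access confirmation.
type Confirmation = { userId: string; resource: string; granted: boolean };

function recordTransaction(
  id: string,
  userId: string,
  amount: number,
  card: string,
): Confirmation {
  // Full detail, including payment data, stays server-side.
  transactionStore.set(id, { userId, amount, card });
  // Only the outcome is posted to the website.
  return { userId, resource: "premium-reports", granted: amount > 0 };
}

const conf = recordTransaction("tx-1", "u42", 9.99, "4111-xxxx");
// The web tier can now gate access without ever seeing card details.
```

If the web host is ever compromised, the attacker gets confirmations, not the transactional detail - which is the promise-theory point the paragraph makes.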
We are big fans of web hosts, but there are always limitations to hosting. When creating a new product or service, avoid over-capacity; don't spend far too much money on infrastructure. Our web host offers different tiers and even dedicated server hosting, which will meet our needs for the foreseeable future. As experts in many Azure resources, we are likely to use Azure, and potentially other platforms, for certain concerns of our solution architecture. Everything needs to scale.
During load testing of our cryptocurrency processing, we found that the data processing parts of our Report Manager, whilst working very efficiently on a local IIS server setup, were being recycled on our shared hosting. We thought about scaling up to dedicated hosting and looked at the costs, before concluding that we could not be certain scaling up would be financially viable in the short term for this project. Did it make sense from a performance perspective either? It can be a slippery slope to keep scaling up. We added extra processing types to our Report Manager within our web data platform that allow us to control what data gets processed in a more finely tuned way. We process reports offline on a local server, distribute them to our website platform, and publish these packaged reports to our users in a timely manner. We get data to our users faster than is possible on much higher-priced hosting tiers with many providers. We don't circumvent any licencing restrictions, because most of our site software is bespoke. We take advantage of the web hosting provider's licensed software in a smart way.
It is important to have a couple of options for every step of your website implementation. Having options keeps you focused on the possibility that what you have implemented will need improvements. This differs from typical Agile SDLC implementations because we avoid planning, avoid creating story points, avoid retrospectives, and simply focus on the jobs to be done when they need to be done.
Sometimes, do the thing that makes less financial sense in the short term because it provides bigger gains in the long term. We recently scaled out our data processing architecture rather than simply scaling up our website hosting to a dedicated platform. We asked a few simple questions:
The above questions are obvious to those trying to balance costs and delivery to their customers. They are uncomfortable questions because, whatever choice is taken, there will be costs attached to those decisions. Remember, we have to be as open to simply scaling up the platform as we are to doing additional work to distribute the processing architecture.
Our latest implementation of our cryptocurrency data analytics platform is processing billions of data points every day. The amount of information being consumed is overwhelming, and we already have mechanisms in place to store logs of events as they happen, and to archive this information to help us make sense of possible issues and failures within the platform.
We are capturing this information, have one or more mechanisms to analyse and report on it, and should be in a position to have metrics on our platform's operations.
We use industry-standard logging frameworks with a clear pathway to turning off certain logging levels as our platform stabilises, to reduce processing on the platform.
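The level-toggling approach can be sketched like this (a minimal illustration, not a specific logging framework): a leveled logger where raising the threshold silences verbose levels without touching any call sites.

```typescript
// Conventional severity ordering, lowest (noisiest) first.
enum Level { Debug = 0, Info = 1, Warning = 2, Error = 3 }

class Logger {
  public entries: string[] = [];
  constructor(private minLevel: Level) {}

  log(level: Level, message: string): void {
    // Anything below the configured threshold is dropped cheaply.
    if (level < this.minLevel) return;
    this.entries.push(`${Level[level]}: ${message}`);
  }

  // Raise the threshold once the platform stabilises.
  setMinLevel(level: Level): void {
    this.minLevel = level;
  }
}

const logger = new Logger(Level.Debug);
logger.log(Level.Debug, "cache refreshed");
logger.setMinLevel(Level.Warning); // stabilised: drop Debug/Info noise
logger.log(Level.Info, "user page viewed"); // silently dropped
logger.log(Level.Error, "feed timeout"); // still recorded
```

Real frameworks read this threshold from configuration, so the reduction in logging overhead is a config change rather than a redeployment.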
If there are three functions {A,B,C}, and there is a use for each of these functions, it is better to have validators in between (another three operations {D,E,F}) to confirm the success of these actions, rather than simply having a sequential process inside a single application.
Immediately we start to see a need for event-driven architecture. The justification for treating these as a set of independent events is that we cannot be sure there won't be more uses for these events in the future. For example, the standard way a user authenticates themselves on the website may result in the creation of an authentication cookie; we may then determine that we need to audit this information; then a customer requests an e-mail whenever a user logs in. So do we keep adding more and more to the website, or do we start looking at creating independent services that can respond to events?
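The login example above can be sketched as events (an illustrative in-process event bus; a real deployment would use a message broker or similar): the website only publishes "user-logged-in", and each new concern attaches as an independent subscriber without the website growing each time.

```typescript
// Minimal publish/subscribe bus (event and payload shapes are illustrative).
type Handler = (payload: { userId: string }) => void;

class EventBus {
  private handlers = new Map<string, Handler[]>();

  subscribe(event: string, handler: Handler): void {
    const list = this.handlers.get(event) ?? [];
    list.push(handler);
    this.handlers.set(event, list);
  }

  publish(event: string, payload: { userId: string }): void {
    for (const h of this.handlers.get(event) ?? []) h(payload);
  }
}

const bus = new EventBus();
const actions: string[] = [];

// Each concern is an independent subscriber; adding the customer's e-mail
// request never meant editing the website's login code.
bus.subscribe("user-logged-in", p => actions.push(`cookie for ${p.userId}`));
bus.subscribe("user-logged-in", p => actions.push(`audit ${p.userId}`));
bus.subscribe("user-logged-in", p => actions.push(`email about ${p.userId}`));

// The website's only responsibility: announce that the event happened.
bus.publish("user-logged-in", { userId: "u42" });
```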
We have built our platform in a manner that permits many elements of the website application to be decoupled and run elsewhere.
Our platform runs on .NET Core hosted within IIS. We know this is what our web host provides because they have told us; however, our hosting is almost exclusively a black box. The overriding majority of website development companies will simply build the website and push it to a live web host. They may never actually run a real test environment, simply building within their integrated development environment before deploying. Even the least technical of people can appreciate what a disaster this may transpire to be. We have made sure we can package our website and deliver it to multiple instances of IIS, mirroring the exact same paths and domains that we have on the live website. We can take data and content and move it to a different environment relatively simply. We can access a production database to pull down backups, or simply small data sets, to test offline and potentially develop fixes and new features.
A CMS is a visual application that allows users relatively proficient in website content management to add and edit website content. Editors can add pages, web parts, pictures, videos and plugins. Many of these CMS solutions permit the incorporation of social media and other great tie-ins. However, there is a huge drawback to tightly coupling yourself to any content management system: upgrades may break earlier versions, and the content that you have created may not fit the new version. This cannot be overstated. It will happen on any CMS platform, and the more plugins a CMS provides, the more of them are likely to become redundant in future.
We quickly recognised that, as great as the CMS solution we use is, and although there are many other .NET Core CMS platforms, it is highly likely that future versions would break our existing version.
We built multiple page types for our CMS, and they are fairly tightly coupled to the CMS provider; there will always be a certain amount of coupling to any CMS. What we have sought to achieve is to create only agnostic components within our platform. For example, you don't see a Facebook plugin written in C# on our website. Instead, we have the ability to write custom plugins in JavaScript that can be dropped into our platform. This allows us to avoid continually redeploying the website because one of our downstream providers changed how they do things.
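The agnostic-plugin idea can be sketched as a tiny registry contract (illustrative only; this is not our actual plugin API): plugins register themselves against a minimal interface, so a downstream provider change means swapping one plugin script rather than redeploying the website.

```typescript
// The whole contract a drop-in plugin must satisfy (illustrative).
type Plugin = { name: string; render: () => string };

const registry = new Map<string, Plugin>();

// Called by each plugin script as it loads.
function registerPlugin(plugin: Plugin): void {
  registry.set(plugin.name, plugin);
}

// Called by the page for each named plugin slot; degrades gracefully
// if a plugin has been removed or has failed to load.
function renderSlot(name: string): string {
  return registry.get(name)?.render() ?? "<!-- plugin missing -->";
}

// A dropped-in plugin: replacing this script changes site behaviour
// without touching the deployed application.
registerPlugin({ name: "social-share", render: () => "<div>share buttons</div>" });
```

The design choice here is that the website only knows slot names, never plugin internals, which is what keeps the C#-side deployment stable when a provider changes.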
From the many CMS platforms we have encountered, if we look into the database we will find all manner of content stored in very peculiar formats which would be almost impossible to extract and move to a different platform when the time arose. For this reason we store much of our content outside of the website CMS. We know we can take this data and content and repurpose it if there were ever a problem with the underlying website's CMS.
Undoubtedly we will see pain when upgrading our CMS to a later version, but we can realistically see much of our componentry existing independently of the CMS, making that transition much simpler.
This is a huge topic which deserves multiple discussions separate from this article, especially when considering information security and possible data breaches. Many will know that proprietary data stores and databases provide additional layers of security to help prevent unauthorised access to content. However, it is interesting to see that in most situations data breaches come down to somebody stealing a database linked to the web server and then reading the information from that database. Data stored in a database is part of the problem, because it is typically structured in a fairly logical manner, meaning it is fairly easy to understand what is being presented. Obfuscation is not infallible, but if you can at least make it harder for information to be linked, so much the better.
Without mentioning a name, one website used for professional development has had multiple breaches. If you log into this website now, they claim that providing your mobile phone number will better protect you. How can having more information about an individual protect them better? They make this claim because two-factor authentication requires you to confirm that you are trying to access a website or application using a mobile phone reference code. A better solution would be to have completely separate factors on separate devices. We have taken this approach within our platform, where we recommend two separate emails, with one never accessed from a mobile phone. This would mean a hacker would have to get access to both your mobile and your desktop, for example.
Why did we give the example above? It was not to criticise the well-known website; it was to highlight the importance of siloing information. Storing it separately makes it harder for it to be assimilated and interpreted. Better still, it makes it easier for this information to be pulled into different applications in the future - it makes the information more portable. You could realistically never store personal information on a website, but store it somewhere completely independent and offline. Nothing is infallible, of course.
If you are a startup, we strongly advise against buying an off-the-shelf CMS solution and adding components and plugins to it. Not only are some of these plugins very expensive, they won't scale. Approach a website development company with the view that you're building a scalable solution that can grow as your organisation grows. Commit to spending, so your development company knows what they are dealing with.
So many startups make the mistake of thinking they can create a basic website using a CMS platform, go for first- and second-round funding, then rewrite the solution: everything will be transported over, the functionality will be improved, and they will have a successful website startup business. One of the hardest things for organisations to do is to remove application functionality. Writing new code and adding new features is something most developers are very comfortable doing; removing functionality and migrating it to new technology is one of the riskiest things that can be done.
We will come across as potentially biased in recommending contacting a company such as ourselves in these matters. Nevertheless, we strongly recommend commissioning a scoping phase with an independent data solutions provider such as ourselves. Doing this gives you some degree of objective analysis of what you're trying to achieve, without making it all about the website.
Almost everything that makes sense to website design agencies and website development companies when they build and implement solutions for their customers is entirely the wrong approach for customers wishing to implement data-centric website content delivery platforms. They lack the skills in database development; they lack the skills in content modelling; they lack the experience of implementing scalable website application architecture; they don't have the experience to push certain components to separate processing tiers. In short, they will just keep recommending adding more plugins. They will claim that a new plugin will fix performance issues and everything will be OK.
Another very important feature to consider is that many major platforms do most of their work completely independently of a website. Take PayPal: they provide a portal to log in, make, approve, reject and monitor payments, but the vast majority of the processing can happen completely off the PayPal website. Furthermore, they provide APIs and portals to permit third parties to link to their payment processing architecture.
It is very rare that a company wanting a website for users to interact with doesn't appreciate that, in many situations, those users don't really want to spend all their time on that website. Let's take our cryptocurrency platform. Users that access reports and data clearly have an advantage over those that don't. Users visualising information can draw insights that may not be apparent to those simply looking at a list of numbers or reading an opinion piece. Those that build automated solutions to interact with information, and can act on new information to make decisions, will get far more benefit than those simply looking at reports and dashboards intermittently, day in, day out.
We have to recognise that a website is a means to an end. Well-constructed, informative content presented in a way that benefits the user will always be extremely valuable. Allowing users to delegate much of the process of interacting with information to more reliable processing architecture is something they will surely want to do. Clearly, we are moving towards artificial intelligence helping users to reduce the amount of time spent doing basic repetitive tasks.
We can now start to imagine what Web 4.0 could be. We have to be honest about what a website should do, and accept that many of the services we traditionally attached to a website should indeed be independent of it. Websites still remain, and will remain for a very long time, the key way to provide users with access to information and functionality.