Recent research by Akamai shows that even a 1-second delay in page response can lead to a 7% reduction in conversions. By that measure, a site earning $500 a day would lose nearly $13,000 a year. In the digital age, websites are routinely required to display large amounts of data in grids, and rendering millions of records in a grid can pose significant performance challenges, leading to slow load times and a poor user experience. In this blog post, we'll explore how to optimize the performance of a website's grid using a .NET Core API and SQL Server, focusing on backend optimization techniques.

In the ideal scenario, when we don't have a large amount of data to render on the UI, client-side paging is enough. For large datasets, however, we need pagination and filtering on the server side.

Here are some benefits associated with it:

Improved Performance

Because we apply the filtering criteria on the server and fetch only the data needed for each page, we reduce the amount of data transferred between the server and the client. This minimizes network latency and improves response times, which leads to a more responsive user experience.

Reduced Server Load

The use of pagination and filtering can also reduce the processing load on the server by limiting the amount of data processed and returned in each request. This prevents unnecessary strain on the server and allows it to handle more concurrent requests efficiently, leading to better scalability and resource utilization.

Optimized Database Queries

Techniques like Skip() and Take() in Entity Framework generate efficient SQL queries that leverage database indexing and query optimization. By fetching smaller chunks of data and applying filtering conditions directly in the database query, we can substantially reduce database load and query execution times.
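As a minimal sketch (the Employee entity, AppDbContext, and column names here are illustrative, not from the original code), Skip() and Take() in EF Core translate into an OFFSET ... FETCH NEXT clause in SQL Server, so only one page of rows ever leaves the database:

```csharp
// Fetch page 'pageIndex' (1-based) of 'pageSize' employees whose last
// name matches a filter. EF Core translates Where/OrderBy/Skip/Take into
// a single SQL query, so filtering and paging both run in the database.
public async Task<List<Employee>> GetPageAsync(
    AppDbContext db, string lastNameFilter, int pageIndex, int pageSize)
{
    return await db.Employees
        .Where(e => e.LastName.Contains(lastNameFilter)) // filter in SQL
        .OrderBy(e => e.Id)           // a stable order is required for paging
        .Skip((pageIndex - 1) * pageSize) // rows belonging to previous pages
        .Take(pageSize)                   // rows for the current page only
        .ToListAsync();
}
```

Note the OrderBy: without a deterministic sort, SQL Server is free to return rows in any order, and pages may overlap or skip records.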

Enhanced User Experience

User experience is the crux of all these practices. Pagination and filtering empower users to navigate large datasets and find the information they are looking for more quickly. Grids and tables that load fast are also easier to swipe and click through, so users benefit from faster loading times and smoother interactions, enhancing overall application usability.


Improved Scalability

As your app grows and handles more user data, the power duo of pagination and filtering helps it stay responsive by limiting the impact of large datasets on server resources and client-side rendering.

Reduced Network Bandwidth

Pagination lets the client download data in smaller chunks instead of all at once. This is particularly useful for people with limited bandwidth or those who access websites on mobile devices; such users benefit from faster loading times and reduced data costs.

Dynamic Data

Imagine you're watching a live feed, like stock prices or a chatroom. With server-side paging, each page is fetched directly from the source, so you always see the most up-to-date data, whereas client-side paging works on a snapshot and can miss these updates. Server-side paging keeps you current, which is crucial for applications with fluctuating data.

Data Cleanup for Current Data

For apps that need to stay current, cleaning up old data can also help performance. By regularly purging outdated or irrelevant records, the dataset size remains manageable, reducing the need for extensive server-side paging and minimizing the overhead associated with managing large datasets.

Client-side pagination involves fetching the entire dataset from the server and then splitting it into pages on the client side. It's like downloading an entire photo album at once and then flipping through it on your phone.

Issues with Client-Side Pagination

While it may seem simpler to implement, client-side pagination has some downsides and limitations compared to server-side pagination. Let's break them down:

Performance Impact

As noted earlier, even a one-second delay in page response can cut conversions by around 7%. Downloading and processing large datasets on the client side can compromise your app's performance, causing slow page loads, increased memory usage, and a poor user experience. This is especially true on devices with limited resources or slower networks.

Data Overload

Client-side pagination loads the entire dataset into the browser, and with large datasets this becomes inefficient and impractical. It can result in lag and even browser crashes when users navigate pages with significant amounts of data.

Bandwidth Consumption

Another issue lies in the bandwidth consumption when transferring large datasets over the network to the client side. This often proves costly for users with limited data plans or slower internet connections. Especially for mobile devices, it means more data usage and slower page loading times.

Security Concerns

There are also security concerns associated with exposing entire datasets to the client side, especially if the data is sensitive or confidential. Client-side pagination can result in data breaches or unauthorized access to confidential data.

Limited Scalability

Imagine an application with millions of photos or a very high number of concurrent users. Unfortunately, for such applications client-side pagination may not scale well because processing and rendering such large datasets in the browser can overwhelm the client's resources and degrade performance. This can make these apps sluggish, leading to poor user experience.

Use Cases for Client-Side Pagination

Now let's look at the brighter side of the picture. While client-side pagination has its drawbacks, it is also suitable in certain scenarios. Let's take a look:

Small Datasets

Are you dealing with small amounts of data? Well, then the client-side pagination will work well. It can comfortably load and process this data in the browser without hampering the performance.

Data Exploration

Looking to explore or analyze small datasets interactively? Then, you can rely on client-side pagination to provide a responsive and intuitive user experience.

Offline Access

But that’s not all. Client-side pagination is also useful when dealing with offline applications or scenarios where users need to access data locally, like accessing a downloaded map on your device. You can do all this without depending on server-side resources or good network connectivity.

Pros and Cons of Server-Side Pagination vs. Client-Side Pagination

Let us now discuss the upsides and downsides of server-side and client-side pagination for a better understanding of the two:

Server-Side Pagination


Pros:

  • Efficient use of server resources.
  • Better performance for large datasets.
  • Reduced bandwidth consumption.
  • Enforces security by limiting the exposure of data.


Cons:

  • More requests to the server, as each page request requires a new set of data.
  • Necessitates additional server-side logic and infrastructure.
  • May involve more complex implementation.

Client-Side Pagination


Pros:

  • Simplified implementation.
  • Suitable for small datasets and simple applications.
  • Provides a responsive user experience for local data processing.


Cons:

  • Performance issues for large datasets.
  • Increased memory usage and bandwidth consumption.
  • Limited scalability and security risks.

While client-side pagination may work for certain use cases, especially for smaller datasets, one must depend on server-side pagination for better performance, scalability, and security, especially when dealing with larger or sensitive datasets.

So, when exactly to use server-side pagination? Here are some situations:

Large Datasets

Server-side pagination is the key when dealing with large datasets. Using it will help you avoid loading the entire dataset into memory and transferring it to the client side. It helps you fetch only the necessary subset of data from the server, thus improving performance and reducing bandwidth consumption.

Another benefit of using server-side pagination is that it ensures that only a manageable chunk of data is processed and transferred over the network, thus improving the loading speed and offering a better user experience.


Scalability

Server-side pagination scales efficiently and is suitable for applications with a growing user base. By keeping the processing load on the server and delivering data in smaller chunks, it handles high traffic and growing data volumes without compromising performance.

The server can thus optimize database queries, apply caching mechanisms, and leverage indexing to efficiently retrieve and deliver paginated data to clients.


Data Security

Server-side pagination maintains data security by limiting access to sensitive information. The client only receives the requested page of data, restricting exposure to unauthorized users.

Server-side paging keeps confidential information on the server rather than downloading it to every device, which prevents unauthorized access and makes it harder for attackers to steal it.


Data Consistency

Server-side pagination keeps data presentation consistent: everyone sees the same data on each page, regardless of their device's capabilities or configuration.

This consistency enhances user experience and facilitates seamless navigation through paginated data.


Optimized Data Retrieval

Server-side pagination enables optimization of database queries, reducing the overhead of retrieving and processing large datasets. It lets the server work smarter, not harder, using indexing, filtering, and sorting to minimize database load and improve query performance.

Additionally, it allows implementation of caching mechanisms to optimize data retrieval and improve application performance.

Server-side pagination is a safe bet for applications with large datasets that require scalability, data security, consistency, and optimized data retrieval. It offers better control over data processing, improving application scalability and the overall user experience.

However, different situations call for different techniques. Depending on the requirements and constraints of the app, additional strategies can offer better performance, scalability, and user experience. Here are some of them:

Data Caching

Use caching mechanisms to store frequently accessed data in memory or in a distributed cache (e.g., Redis). This reduces the need to fetch the same data repeatedly from the database, substantially improving response times and reducing database load.

Caching strategies to consider include in-memory caching, output caching, and query caching, each of which can level up performance for different datasets and operations.
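As a hedged sketch of in-memory caching in ASP.NET Core (the cache key, Employee entity, and AppDbContext are illustrative), IMemoryCache.GetOrCreateAsync fetches a page from the database only when the cache misses:

```csharp
// Cache one page of results for 60 seconds. Repeated requests for the
// same page are served from memory instead of hitting the database.
public async Task<List<Employee>> GetCachedPageAsync(
    IMemoryCache cache, AppDbContext db, int pageIndex, int pageSize)
{
    string cacheKey = $"employees:page:{pageIndex}:{pageSize}";
    return await cache.GetOrCreateAsync(cacheKey, async entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(60);
        return await db.Employees
            .OrderBy(e => e.Id)
            .Skip((pageIndex - 1) * pageSize)
            .Take(pageSize)
            .ToListAsync();
    });
}
```

The short expiration is a trade-off: it bounds how stale a cached page can get while still absorbing bursts of repeated requests.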

Data Denormalization

Denormalize database tables by storing redundant or precomputed data to optimize read performance. Denormalization reduces the need for complex joins and calculations at runtime, improving query performance and response times.

Analyze query patterns to identify any opportunities for denormalization and optimize data schema accordingly.

Content Delivery Networks (CDNs)

A CDN is like having smaller warehouses closer to different user bases instead of one giant warehouse. Use CDNs to cache and deliver static assets (e.g., images, CSS files, JavaScript files) closer to users, thus bringing down latency and improving load times. CDNs are distributed across a network of edge servers, ensuring speedy delivery to users irrespective of where they are located.

Integrate CDN support into the application architecture to provide static content efficiently and enhance the overall performance.

Asynchronous Processing

Asynchronous processing is like having some extra help in a busy restaurant with a single waiter. It offloads long-running or resource-intensive tasks to run in the background, freeing up server resources so the application can handle incoming requests more efficiently.

Implement asynchronous patterns such as message queues, background jobs, and event-driven architectures to decouple big components and improve the scalability of your application.
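One way to sketch this in .NET is with System.Threading.Channels as an in-process work queue (a simplified stand-in for a real message broker or background-job library):

```csharp
// A minimal in-process background queue: request handlers enqueue work
// items and return immediately; a hosted worker drains the channel.
public class ReportQueue
{
    private readonly Channel<int> _channel = Channel.CreateUnbounded<int>();

    // Called from the request path; non-blocking.
    public void Enqueue(int reportId) => _channel.Writer.TryWrite(reportId);

    // Run by a background service; processes items off the request path.
    public async Task ProcessAsync(CancellationToken ct)
    {
        await foreach (int reportId in _channel.Reader.ReadAllAsync(ct))
        {
            await GenerateReportAsync(reportId); // the long-running work
        }
    }

    private Task GenerateReportAsync(int reportId) => Task.CompletedTask;
}
```

In production you would typically back this with a durable queue so work survives a process restart.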

Load Balancing and Horizontal Scaling

Load balancing is like having a bunch of computers working together. Deploy load balancers and distribute the incoming traffic across multiple servers or nodes, ensuring optimal utilization and high availability of resources. This can significantly improve fault tolerance and scalability by evenly distributing requests among the available backend servers.

Implement horizontal scaling and add more server instances or nodes to your application infrastructure as its demand grows. This can help increase the application capacity and enable it to handle a high volume of concurrent user requests.

Query Optimization and Indexing

A database query is like asking a question of a big, well-organized filing room. It is important to optimize database queries by analyzing query execution plans, identifying performance bottlenecks, and tuning SQL queries, joins, and indexes. Use database management tools to monitor query performance metrics and identify optimization opportunities.

Create indexes on frequently queried columns to improve query performance and reduce the time required to fetch data from the database. Regularly review and update indexes based on usage patterns and query performance metrics.
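For example, with EF Core's fluent API an index on a frequently filtered column might be declared like this (the entity and column names are illustrative):

```csharp
// In your DbContext: declare an index on Employees.LastName so that
// lookups and filters on that column can use an index seek instead of
// scanning the whole table.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Employee>()
        .HasIndex(e => e.LastName)
        .HasDatabaseName("IX_Employees_LastName");
}
```

The index is then created by the next EF Core migration; keep in mind that each index speeds up reads at some cost to writes.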

Content Compression and Minification

Compress static assets (HTML, CSS, JavaScript) to reduce their file sizes and network latency. Content compression techniques such as GZIP and Brotli can bring down the amount of data transferred over the network, significantly improving page load times and user experience.

Use server-side and client-side tools to automate compression and minification, ensuring optimal performance for web applications.
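In ASP.NET Core, response compression can be enabled with the built-in middleware; this is a minimal sketch, and you would tune the providers and MIME types for your own app:

```csharp
// Program.cs: enable Brotli/GZIP compression for API responses.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddResponseCompression(options =>
{
    options.EnableForHttps = true; // also compress HTTPS responses
    options.Providers.Add<BrotliCompressionProvider>();
    options.Providers.Add<GzipCompressionProvider>();
});

var app = builder.Build();
app.UseResponseCompression(); // must run before the endpoints
app.MapGet("/", () => "Hello, compressed world!");
app.Run();
```

Compression matters most for large JSON grid payloads, which typically shrink dramatically under GZIP or Brotli.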

Progressive Rendering and Lazy Loading

Imagine a website loading piece by piece, like a picture slowly showing up. This lets you start using the website even before everything is finished downloading, making it feel faster. Implement progressive rendering techniques to prioritize loading and rendering critical content first to improve user engagement and perceived performance. It enables users to interact with the application while the additional content is being loaded.

Use lazy loading for non-essential or below-the-fold content to defer loading until it's needed, reducing initial page load times and conserving bandwidth. Lazy loading optimizes resource usage and improves overall performance for content-heavy pages.

Use these methodologies and strategies as per your specific requirements and application challenges. These techniques help developers optimize app performance, scalability, and user experience in general.

In this code, we create a ColumnSearchModel to filter on multiple columns: SearchTerm holds the filter value and SearchColumn holds the name of the column the filter applies to.

SearchFilterAttributeModel is passed into the API. It contains a List of ColumnSearchModel (to filter several columns at once), PageIndex (the page number to show), PageSize (the number of records per page), and UserId (the id of the currently logged-in user).

SearchProfileResponceModel is the user model with the columns you want to display in the grid.

GridResponceModel is the API response. It contains a List of SearchProfileResponceModel and TotalCount, the total number of records that qualify after the filters are applied; paging is applied afterwards and the matching user profiles are returned.

SearchProfile is called with searchAttributes and returns a gridResponceModel.
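The original snippets are not reproduced here, so the following is a reconstruction of the models from the description above (the property names and types beyond those mentioned in the text are assumptions):

```csharp
// Filter for a single column: which column, and what value to match.
public class ColumnSearchModel
{
    public string SearchColumn { get; set; }
    public string SearchTerm { get; set; }
}

// Request body for the search API: column filters plus paging info.
public class SearchFilterAttributeModel
{
    public List<ColumnSearchModel> ColumnSearchModels { get; set; }
    public int PageIndex { get; set; }   // page number to show (1-based)
    public int PageSize { get; set; }    // number of records per page
    public int UserId { get; set; }      // currently logged-in user's id
}

// One row of the grid (only the columns the grid displays).
public class SearchProfileResponceModel
{
    public int EmployeeId { get; set; }
    public string EmployeeName { get; set; }
    public string DepartmentName { get; set; }
}

// API response: one page of rows plus the total count of matches,
// which the UI needs to compute the number of pages.
public class GridResponceModel
{
    public List<SearchProfileResponceModel> Profiles { get; set; }
    public int TotalCount { get; set; }
}
```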

We store the LINQ query that joins multiple tables in a variable called data. It joins the two tables Employees and Departments as emp and dept.

We apply filters only if at least one ColumnSearchModel is provided. For each column, the filter is applied by matching the column name in a switch statement, and the corresponding filtering condition is added to the data query.

The total count of the filtered data is assigned to gridResponceModel.TotalCount.

We then apply Skip and Take to the query: Skip skips the records belonging to previous pages, and Take fetches the number of records to display on the current page. Using Skip and Take, we retrieve only the records required for that page.
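Since the original snippet isn't shown, here is a reconstruction of what SearchProfile might look like based on these steps (the table names, column names, and the injected _db context are assumptions):

```csharp
public async Task<GridResponceModel> SearchProfile(
    SearchFilterAttributeModel searchAttributes)
{
    // Join Employees and Departments; nothing executes yet (deferred).
    var data = from emp in _db.Employees
               join dept in _db.Departments
                   on emp.DepartmentId equals dept.Id
               select new SearchProfileResponceModel
               {
                   EmployeeId = emp.Id,
                   EmployeeName = emp.Name,
                   DepartmentName = dept.Name
               };

    // Apply each column filter, if any were sent.
    if (searchAttributes.ColumnSearchModels != null)
    {
        foreach (var filter in searchAttributes.ColumnSearchModels)
        {
            switch (filter.SearchColumn)
            {
                case "EmployeeName":
                    data = data.Where(p => p.EmployeeName.Contains(filter.SearchTerm));
                    break;
                case "DepartmentName":
                    data = data.Where(p => p.DepartmentName.Contains(filter.SearchTerm));
                    break;
            }
        }
    }

    var gridResponceModel = new GridResponceModel
    {
        // Total records matching the filters (counted before paging).
        TotalCount = await data.CountAsync(),

        // Skip the previous pages, take only the current page.
        Profiles = await data
            .OrderBy(p => p.EmployeeId)
            .Skip((searchAttributes.PageIndex - 1) * searchAttributes.PageSize)
            .Take(searchAttributes.PageSize)
            .ToListAsync()
    };

    return gridResponceModel;
}
```

Because the filters are composed onto the IQueryable before execution, both the CountAsync and the page fetch run as SQL in the database, never materializing the full dataset in memory.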

After using this approach on the API side, lazy loading is not required on the UI side, because the API sends only a single page of data for the grid.

The API also sends TotalCount, the total number of matching records, which helps you manage pages on the UI: Math.ceil(TotalCount / PageSize) rounds up to give the total number of pages.

