Monday, November 3, 2008

Ajax

Introduction
Asynchronous JavaScript and XML (AJAX) has become a basic need of new-generation websites. Using AJAX in ASP.NET is very easy: we just place a few lines in the web.config file and start using the AJAX controls. But first you should download the ASP.NET AJAX Extensions installer from the following Microsoft link…

http://www.microsoft.com/downloads/details.aspx?FamilyID=ca9d90fa-e8c9-42e3-aa19-08e2c027f5d6&displaylang=en

Requirements:
(1) Microsoft .NET Framework Version 2.0
(2) IE 5.01 or later
(3) Windows 2000; Windows Server 2003; Windows Vista; Windows XP
After installing this exe you will get an additional group in your .NET toolbox called AJAX Extensions.

Now there are two ways to use AJAX:
(1) Create a new AJAX-enabled web site: File -> New -> AJAX Enabled Web Site
OR
(2) Modify your existing ASP.NET 2.0 website.
In that case you must add the following lines to your web.config file:

<httpHandlers>
  <remove verb="*" path="*.asmx"/>
  <add verb="*" path="*.asmx" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=1.0.61025.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
  <add verb="GET,HEAD" path="ScriptResource.axd" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=1.0.61025.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" validate="false"/>
</httpHandlers>
<httpModules>
  <add name="ScriptModule" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=1.0.61025.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
</httpModules>

If you wonder where these lines come from, the answer is that you can find them in the configuration file that ships with the extensions, under \Program Files\Microsoft ASP.NET\ASP.NET 2.0 AJAX Extensions\v1.0.61025\web.config on your system drive.
If you don't add them, you will get the 'Sys is undefined' JavaScript error and your AJAX pages will not work.
So cheer up, your application is now ready to use AJAX in .NET 2.0. First place a ScriptManager and then an UpdatePanel. When any control inside the UpdatePanel posts back, only the part of the page within the UpdatePanel is submitted to the server. This is very useful when your page posts back frequently.
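A minimal markup sketch of this pattern (the control IDs and the click handler are hypothetical):

<asp:ScriptManager ID="ScriptManager1" runat="server" />
<asp:UpdatePanel ID="UpdatePanel1" runat="server">
    <ContentTemplate>
        <!-- only this region is refreshed on an async postback -->
        <asp:Label ID="lblTime" runat="server" />
        <asp:Button ID="btnRefresh" runat="server" Text="Refresh" OnClick="btnRefresh_Click" />
    </ContentTemplate>
</asp:UpdatePanel>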

Ref:
http://www.afteredge.com/post/UpdatePanel-Extender-for-ASPNET-AJAX.aspx
http://www.asp.net/AJAX/Documentation/Live/tutorials/UsingUpdatePanelControls.aspx
http://www.asp.net/AJAX/Documentation/Live/overview/AsynchronousLayerOverview.aspx


Thursday, October 30, 2008

Caching in ASP.NET

ASP.NET provides support for page, partial page (fragment), and data caching. Caching a dynamically generated page is called page output caching. With page output caching, a dynamically generated page is generated only the first time it is accessed; any subsequent access to the same page is served from the cache. ASP.NET also allows caching a portion of a page, called partial page caching or fragment caching. Other server data (e.g. SQL Server data, XML data) can be cached so that it is easily accessed without being re-retrieved, using data caching. Caching reduces the number of round trips to the database and other data sources. ASP.NET provides a full-featured data cache engine, complete with support for scavenging (based on cache priority), expiration, and file, key, and time dependencies. There are two locations where caching can be used to improve performance in ASP.NET applications.
Ref Fig 1: Caching opportunities in ASP.NET
In the picture above, (1) is where a cached copy of the page is returned (output caching), and (2) is where round trips are saved by storing data (data caching).
ASP.NET supports two types of expiration policies, which determine when an object will be expired and removed from the cache:
Absolute expiration: The expiration occurs at a specified time. Absolute expirations are specified in full-time format (hh:mm:ss); the object is removed from the cache at that time.
Sliding expiration: The expiration occurs after a specified interval has passed without the item being accessed; each access resets the countdown.
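A brief VB.NET sketch of both policies using Cache.Insert; the key and the data variable are hypothetical:

' absolute expiration: removed 10 minutes from now, no matter how often it is read
Cache.Insert("customers", data, Nothing, DateTime.Now.AddMinutes(10), Cache.NoSlidingExpiration)
' sliding expiration: removed 10 minutes after its last access
Cache.Insert("customers", data, Nothing, Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(10))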
ASP.NET supports three types of caching:
Page output caching [output caching]
Fragment caching [output caching]
Data caching
Different Types of Caching
1. Page Output Caching: Before starting with page output caching we need to know the compilation process of a page, because understanding how a page is generated makes it clear why we should use caching. An ASPX page is compiled in a two-stage process. First, the code is compiled into Microsoft Intermediate Language (MSIL). Then the MSIL is compiled into native code (by the JIT compiler) during execution. The entire code in an ASP.NET web page is compiled into MSIL when we build the site, but at execution time only the portion of MSIL that the user's request actually needs is converted to native code, which also improves performance.

Ref Fig 2: Caching opportunities in ASP.NET
Whatever we gain from that, a page whose content changes frequently still has to be regenerated and JIT-compiled every time. So we should use page output caching for pages whose content is relatively static: rather than generating the page on each user request, we cache it using page output caching so that subsequent requests can be served from the cache itself. Pages are generated once and then cached for subsequent fetches. Page output caching allows the entire content of a given page to be stored in the cache.
Ref Fig 3: Caching opportunities in ASP.NET
As the picture shows, the page is cached when the first request arrives, and later requests for the same page are served from the cache rather than by regenerating the page. For output caching, an OutputCache directive can be added to any ASP.NET page, specifying the duration (in seconds) that the page should be cached.
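For example, this directive caches a page for 60 seconds, with a single cached version regardless of query string or form parameters:

<%@ OutputCache Duration="60" VaryByParam="None" %>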

ADO.NET

1. What is ADO.NET?
ADO.NET is the new database technology of the .NET (Dot Net) platform, and it builds on Microsoft ActiveX® Data Objects (ADO).
ADO is a language-neutral object model that is the keystone of Microsoft's Universal Data Access strategy.
ADO.NET defines DataSet and DataTable objects which are optimized for moving disconnected sets of data across intranets and the Internet, including through firewalls. It also includes the traditional Connection and Command objects, as well as an object called a DataReader that resembles a forward-only, read-only ADO recordset. If you create a new application, it will require some form of data access most of the time.
ADO.NET provides data access services in the Microsoft .NET platform through four data providers:
Data Provider for SQL Server (System.Data.SqlClient).
Data Provider for OLEDB (System.Data.OleDb).
Data Provider for ODBC (System.Data.Odbc).
Data Provider for Oracle (System.Data.OracleClient).
ADO.NET is a set of classes that expose data access services to the .NET developer. The ADO.NET classes are found in System.Data.dll and are integrated with the XML classes in System.Xml.dll.
There are two central components in ADO.NET: the DataSet and the .NET Framework data provider.
A data provider is a set of components including:
the Connection object (SqlConnection, OleDbConnection, OdbcConnection, OracleConnection)
the Command object (SqlCommand, OleDbCommand, OdbcCommand, OracleCommand)
the DataReader object (SqlDataReader, OleDbDataReader, OdbcDataReader, OracleDataReader)
and the DataAdapter object (SqlDataAdapter, OleDbDataAdapter, OdbcDataAdapter, OracleDataAdapter).
The DataSet object represents a disconnected cache of data, made up of DataTables and DataRelations that represent the result of the command.
In any .NET data access page, before you connect to a database, you first have to import all the necessary namespaces that will allow you to work with the objects required. As we're going to work with SQL Server, we'll first import the namespaces we need. Namespaces in .NET are simply a neat and orderly way of organizing objects, so that nothing becomes ambiguous.
<%@ Import Namespace="System" %>
<%@ Import Namespace="System.Data" %>
<%@ Import Namespace="System.Data.SqlClient" %>
Note: If we were using a database other than SQL Server, for instance MS Access, we would replace SqlClient with OleDb. For Oracle, .NET v1.1 provides the System.Data.OracleClient namespace, and for any ODBC data source it provides the System.Data.Odbc namespace. You'll find detailed information on all the available methods and objects in the .NET Framework SDK documentation.
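Putting these pieces together, here is a minimal VB.NET sketch that fills a DataSet through a data adapter; the connection string, table, and column names are placeholders, not a prescribed setup:

Dim conn As New SqlConnection("Server=(local);Database=Northwind;Integrated Security=SSPI")
Dim adapter As New SqlDataAdapter("SELECT CustomerID, CompanyName FROM Customers", conn)
Dim ds As New DataSet()
adapter.Fill(ds, "Customers")   ' Fill opens and closes the connection for us
Dim firstCompany As String = ds.Tables("Customers").Rows(0)("CompanyName").ToString()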

ASP.Net Page Life Cycle

Introduction
Many of us know that IIS is a web server and we use it for our .NET applications, since we need a web server to run a web application. But many of us don't know the internal architecture of IIS. This article is written for beginners, to explain the architecture of IIS.
How does a simple web page execution happen?
As we all know, a request comes from the client (browser) and is sent to the server (the web server); the server processes the request and sends the response back to the client according to the request.
But internally the web server goes through quite an interesting process. To understand that process, we should first know the architecture of IIS.
It mainly consists of three parts/files:
Inetinfo.exe
ISAPI filters (containers for Internet Server Application Programming Interface DLLs)
Worker process (aspnet_wp.exe)
Whenever a request comes from the client:
Inetinfo.exe is the IIS process that picks up requests from the client. If the request is for a static resource like an HTML file or an image file, inetinfo.exe processes the request and sends the response to the client. If the request has an .aspx/.asp extension, inetinfo.exe passes the request to the ISAPI filter. The ISAPI filter has several runtime modules, called ISAPI extensions, that it uses to process the request. The runtime module loaded for an ASP page is asp.dll; for an ASP.NET page it is ASPNET_ISAPI.dll. From here the request is passed to the worker process. The worker process hosts several application domains.
Application Domain
The purpose of an application domain is to isolate one application from another. Whenever we create a new application, application domains are created automatically by the CLR host. The worker process creates a block of memory for each particular application. Application domains provide a secure and versatile unit of processing that the common language runtime can use to provide isolation between applications. Application domains are normally created by runtime hosts; a runtime host is responsible for bootstrapping the common language runtime before an application is run.
The worker process sends the request to the HTTP pipeline (the HTTP pipeline is nothing but a collection of .NET Framework classes). The HTTP pipeline compiles the request into a library and makes a call to the HTTP runtime, and the runtime creates an instance of the page class:
Public Class File
Inherits System.Web.UI.Page
End Class 'File
An ASP.NET web page is a class derived from the Page class; this Page class resides in System.Web.dll.
After creating the instance of the page class, the HTTP runtime immediately invokes the ProcessRequest method of the page class:
Dim Req As New Page

Req.ProcessRequest()
The ProcessRequest method does the following things:
Initialize the memory
Load the view state
Page execution and postback events
Render the HTML content
Release the memory
The ProcessRequest method executes a set of events for the page class. These are called page life cycle events.
Page Life Cycle Events
Page_Init: The server controls are loaded and initialized from the web form's view state. This is the first step in a web form's life cycle.
Page_Load: The server controls are loaded in the page object. View state information is available at this point, so this is where you put code to change control settings or display text on the page.
Page_PreRender: The application is about to render the page object.
Page_Unload: The page is unloaded from memory.
Page_Disposed: The page object is released from memory. This is the last event in the life of a page object.
Page_Error: An unhandled exception occurs.
Page_AbortTransaction: A transaction is aborted.
Page_CommitTransaction: A transaction is accepted.
Page_DataBinding: A server control on the page binds to a data source.
The ProcessRequest method finally renders the HTML page. A typical Page_Load handler is sketched below.
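A minimal VB.NET sketch of a Page_Load handler; lblMessage is a hypothetical Label control:

Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
    If Not IsPostBack Then
        ' runs only on the first request, not on subsequent postbacks
        lblMessage.Text = "Welcome!"
    End If
End Sub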
Dependencies:
When the request reaches the ASP.NET worker process, it is forwarded to the HTTP application factory. This application factory maintains the addresses of the application domains currently executing under the worker process. If the application domain for the requested virtual directory is unavailable, a new application domain is created; if the application domain already exists, the request is forwarded to the corresponding AppDomain.
The application domain maintains a page handler factory class, which contains the addresses of all libraries corresponding to web pages. If the requested web page's library is available, an instance of the page class is created; if the library is unavailable, the request is forwarded to the HTTP pipeline.

Wednesday, October 29, 2008

Quick Reference / Links

Group Name : Dot Net Reference
Dot net tree
http://www.obout.com/
ASP Web Mail
http://www.webmailasp.net/webmailsoftwareFAQ.asp
HTML Editor
http://tinymce.moxiecode.com/example_full.php?example=true
Function to check numeric data
http://www.codeproject.com/useritems/Check_For_Numeric_Format.asp
Increase Session
http://radio.javaranch.com/pascarello/2005/07/05/1120592884938.html
.NET Framework
http://www.andymcm.com/dotnetfaq.htm

Group Name : Ajax
AJAX Extension Toolkit
http://www.afteredge.com/post/UpdatePanel-Extender-for-ASPNET-AJAX.aspx
http://www.asp.net/AJAX/Documentation/Live/tutorials/UsingUpdatePanelControls.aspx
http://www.asp.net/AJAX/Documentation/Live/overview/AsynchronousLayerOverview.aspx

AJAX Control Toolkit
sp.net/ajax/ajaxcontroltoolkit/samples/

http://aspalliance.com/articles/LearnAJAX.aspx

Group Name : Funny
Royal Chowdary
http://www.royalchowdary.com/phpBB/
All Code projects
http://studyonnet.com/database.asp
Prasad Multiplex (IMax)
http://www.pradadz.com/
all messengers in one site
http://www7.meebo.com/
Indian Child
http://www.indianchild.com/nursery%20rhymes.htm
Manthena Satyanarayana Raju
http://www.teluguone.com/health/manthena/index.jsp?filename=video.jsp

Group Name : XML
XML Online Class (Voice)
http://www.microsoft.com/seminar/mmcfeed/MMCDisplayFeed.asp?Sort=title&Lang=en&Product=103337&Audience=100402&

Group Name : Mail Servers
Yahoo Mail
http://mail.yahoo.com/
Rediff Mail
http://www.rediffmail.com/
Tigra Menu
http://www.softcomplex.com/download.html

Group Name : Interview Tips
.Net Interview tips
http://blogs.crsw.com/mark/articles/254.aspx#

Group Name : SQL Server
Interview Tips
http://vyaskn.tripod.com/iq.htm
SQL Team
http://www.sqlteam.com/
SQL in Simple English
http://www.codecoffee.com/articles/sql1.html
xml webcast
http://support.microsoft.com/default.aspx?scid=%2Fservicedesks%2Fwebcasts%2Fwc121801%2Fwcblurb121801%2Easp
SQL JOb Schedule
http://www.aspfaq.com/show.asp?id=2403
Send email from SQL Server
http://www.planet-source-code.com/vb/scripts/ShowCode.asp?txtCodeId=670&lngWId=5

Group Name : -Others-
Sanskrit Meanings
http://webapps.uni-koeln.de/tamil/
Z-index+Flash file+javascript menu
http://forums.swishzone.com/lofiversion/index.php?t24607.html
Connection Strings
http://www.connectionstrings.com/
Tata Indicom
http://210.210.24.66/tataindicom/html/instantPay_billdetails.htm
HTML Codes
http://ascii.cl/htmlcodes.htm
search site like google.com
http://www.dhoondho.com/
Online SBI
https://www.onlinesbi.com/login.html
Infanetsolutions
http://www.infanetsolutions.com/index.html
Active User Count
http://www.asptutorial.info/script/activeuserscounter/
Crackspider.net
http://crackspider.net/search.shtml?sid=2005053016&q=winzip
Creak
http://docrack.com/
whois
http://www.allwhois.com/cgi-bin/allwhois.cgi
Remote Procedure Call (RPC)
http://securityresponse.symantec.com/avcenter/venc/data/w32.blaster.worm.html

Books download free
http://seshareddy.com/www.flazx.com

Blood Donors
http://seshareddy.com/www.indianblooddonors.com
BSNL
https://portal.bsnl.in/Vas/login.asp
Excillion
http://excillion.westin.smashservers.net:8086/
Like Google Earth
http://www.wikimapia.org/
EMI
http://hsbc.co.in/in/personal/loans/emicalc_personal.htm
Easy Movies
http://202.63.109.98/Moviesindex.asp
Trick and Tips Tutorial Blogger

http://trick-blog.blogspot.com/2008/01/make-blog-more-expressive-with-auto.html

How to Improve ASP.Net Web Application Performance

1) Turn off tracing unless it is required
Tracing is one of the wonderful features which enable us to track the application's trace and sequences. However, it is useful only for developers, so you can set it to "false" unless you need to monitor trace logging.
How it affects performance: Enabling tracing adds performance overhead and might expose private information, so it should be enabled only while an application is being actively analyzed.
Solution: When not needed, turn tracing off in web.config (see the sketch after item 4).
2) Turn off session state, if not required
One extremely powerful feature of ASP.NET is its ability to store session state for users, such as a shopping cart on an e-commerce site or a browser history.
How it affects performance: Since ASP.NET manages session state by default, you pay the cost in memory even if you don't use it. Whether you store your data in-process, on a state server, or in a SQL database, session state requires memory, and it also takes time to store and retrieve data from it.
Solution: You may not require session state when your pages are static or when you do not need to store information captured in the page. In such cases, disable it on your web form using the directive <%@ Page EnableSessionState="false" %>. If you use session state only to retrieve data and not to update it, make it read-only using the directive <%@ Page EnableSessionState="ReadOnly" %>.
3) Disable view state of a page if possible
View state is a fancy name for ASP.NET storing some state data in a hidden input field inside the generated page. When the page is posted back to the server, the server can parse, validate, and apply this view state data back to the page's tree of controls. View state is a very powerful capability since it allows state to be persisted with the client and requires no cookies or server memory. Many ASP.NET server controls use view state to persist settings made during interactions with elements on the page, for example, saving the current page that is being displayed when paging through data.
How it affects performance:
• View state increases the total payload of the page, both when served and when requested.
• There is an additional overhead incurred when serializing or deserializing view state data that is posted back to the server.
• View state increases the memory allocations on the server.
• Several server controls, the most well known of which is the DataGrid, tend to make excessive use of view state, even in cases where it is not needed.
Solution: Pages that do not have any server postback events can have view state turned off. The default behavior of the ViewState property is enabled, but if you don't need it, you can turn it off at the control or page level. Within a control, simply set the EnableViewState property to false, or set it for the whole page with <%@ Page EnableViewState="false" %>. If you turn view state off for a page or control, make sure you thoroughly test your pages to verify that they continue to function correctly.
4) Set debug="false" in web.config
When you create the application, this attribute is set to "true" by default, which is very useful while developing. However, when you deploy your application, always set it to "false".
How it affects performance: Setting it to "true" requires the pdb information to be inserted into the file, which results in a comparatively larger file and slower processing.
Solution: Always set debug="false" before deployment.
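A minimal web.config sketch combining tips 1, 2, and 4; treat it as an illustration, not a complete configuration:

<configuration>
  <system.web>
    <trace enabled="false" />
    <sessionState mode="Off" />
    <compilation debug="false" />
  </system.web>
</configuration>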
5) Avoid Response.Redirect
The Response.Redirect() method simply tells the browser to visit another page.
How it affects performance: Redirects are very chatty. They should only be used when you are transferring people to another physical web server.
Solution: For any transfers within your own server, use Server.Transfer; you will save a lot of needless HTTP requests. Instead of telling the browser to redirect, it simply changes the "focus" on the web server and transfers the request. This means you don't get quite as many HTTP requests coming through, which eases the pressure on your web server and makes your applications run faster.
Tradeoffs:
• Server.Transfer works only for pages running on the same server; only Response.Redirect can send users to another site.
• Server.Transfer maintains the original URL in the browser. This can really help streamline data entry techniques, although it may make for confusion when debugging.
5a) To reduce the CLR exception count, use Response.Redirect(".aspx", false) instead of Response.Redirect(".aspx").
6) Use StringBuilder to concatenate strings
How it affects performance: String is evil when you want to append and concatenate text. All the modifications you make to a string are stored in memory as separate references: when a string is modified, the runtime creates a new string and returns it, leaving the original to be garbage collected. Most of the time this is a fast and simple way to do it, but when a string is modified repeatedly it begins to be a burden on performance: all of those allocations eventually get expensive.
Solution: Use StringBuilder whenever string concatenation is needed, so that the value is updated in place without creating additional references (see the sketch after item 10).
7) Avoid throwing exceptions
How it affects performance: Exceptions are probably one of the heaviest resource hogs and causes of slowdowns you will ever see in web applications, as well as Windows applications.
Solution: You can use as many try/catch blocks as you want; it is using exceptions gratuitously that costs you performance. For example, stay away from things like using exceptions for control flow.
8) Use the Finally block to release resources
The finally block gets executed regardless of the outcome of the try block. Always use the finally block to release resources such as database connections and open files, so that they get cleaned up whether the code in the try block succeeded or control went to the catch block.
9) Use client-side scripts for validations
User input is evil and it must be thoroughly validated before processing to avoid overhead and possible injections into your applications.
How it improves performance: Client-side validation can help reduce the round trips required to process a user's request. In ASP.NET you can also use validation controls to validate user input on the client. However, do a check at the server side too, to cover the infamous JavaScript-disabled scenarios.
10) Avoid unnecessary round trips to the server
How it affects performance: Round trips significantly affect performance. They are subject to network latency and to downstream server latency. Many data-driven web sites heavily access the database for every user request. While connection pooling helps, the increased network traffic and processing load on the database server can adversely affect performance.
Solution:
• Keep round trips to an absolute minimum.
• Implement an Ajax UI whenever possible. The idea is to avoid full page refreshes and only update the portion of the page that needs to be changed.
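A minimal VB.NET sketch for tip 6; the loop bound and text are arbitrary:

Dim sb As New System.Text.StringBuilder()
For i As Integer = 1 To 1000
    sb.Append("item ")
    sb.Append(i)
    sb.AppendLine()
Next
Dim result As String = sb.ToString()   ' one string built at the end, not one per concatenation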
11) Use Page.IsPostBack
Make sure you don't execute code needlessly. Use the Page.IsPostBack property to ensure that you only perform page initialization logic when a page is loaded for the first time and not in response to client postbacks.
12) Include return statements within your functions/methods
How it improves performance: Explicitly using return allows the JIT to perform slightly more optimizations. Without a return statement, each function/method is given several local variables on the stack to transparently support returning values without the keyword. Keeping these around makes it harder for the JIT to optimize, and can impact the performance of your code. Look through your functions/methods and insert return as needed. It doesn't change the semantics of the code at all, and it can help you get more speed from your application.
13) Use a Foreach loop instead of a For loop for string iteration
Foreach is far more readable, and in the future it will become as fast as a For loop for special cases like strings. Unless string manipulation is a real performance hog for you, the slightly messier code may not be worth it.
14) Avoid unnecessary indirection
How it affects performance: When you use ByRef, you pass pointers instead of the actual object. Many times this makes sense (side-effecting functions, for example), but you don't always need it. Passing pointers results in more indirection, which is slower than accessing a value that is on the stack.
Solution: When you don't need to go through the heap, it is best to avoid the indirection.
15) Use ArrayLists in place of arrays
How it improves performance: An ArrayList has everything that is good about an array plus automatic sizing, Add, Insert, Remove, Sort, and BinarySearch. All these great helper methods come from implementing the IList interface.
Tradeoffs: The downside of an ArrayList is the need to cast objects upon retrieval.
16) Always check Page.IsValid when using validator controls
Always make sure you check Page.IsValid before processing your forms when using validator controls (see the sketch after item 19).
17) Use paging
Take advantage of paging's simplicity in .NET. Only show small subsets of data at a time, allowing the page to load faster.
Tradeoffs: Just be careful when you mix in caching; don't cache all the data in the grid.
18) Store your content by using caching
How it improves performance: ASP.NET allows you to cache entire pages, fragments of pages, or controls. You can also cache variable data by specifying the parameters that the data depends on. By using caching you help the ASP.NET engine return data for repeated requests for the same page much faster.
When and why to use caching: Proper use and fine-tuning of your caching approach will result in better performance and scalability of your site. However, improper use of caching will actually slow your site down and consume a lot of server memory and processing. Good candidates for caching are data that changes infrequently and static content of web pages.
19) Use low-cost authentication
Authentication can also have an impact on the performance of your application. For example, Passport authentication is slower than forms-based authentication, which in turn is slower than Windows authentication.
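A brief VB.NET sketch for tip 16; the button handler and the SaveRecord helper are hypothetical:

Protected Sub btnSave_Click(ByVal sender As Object, ByVal e As EventArgs)
    ' process the form only when every validator on the page passed
    If Page.IsValid Then
        SaveRecord()   ' hypothetical helper
    End If
End Sub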
20) Minimize the number of web server controls
How it affects performance: The use of web server controls increases the response time of your application because they need time to be processed on the server side before they are rendered on the client side.
Solution: One way to minimize the number of web server controls is to use plain HTML elements where they are suited, for example to display static text.
21) Avoid using unmanaged code
How it affects performance: Calls to unmanaged code are a costly marshaling operation.
Solution: Try to reduce the number of calls between managed and unmanaged code. Consider doing more work in each call rather than making frequent calls to do small tasks.
22) Avoid making frequent calls across processes
If you are working with distributed applications, calls across processes involve additional overhead negotiating network and application-level protocols, and network speed can also be a bottleneck. Try to do as much work as possible in fewer calls over the network.
23) Clean up style sheets and script files
A quick and easy way to improve your web application's performance is to go back and clean your CSS style sheets and script files of unnecessary code, old styles, and unused functions. It is common for old styles and functions to linger in these files through development cycles and site improvements.
Many websites use a single CSS style sheet or script file for the entire website. Sometimes, just going through these files and cleaning them up can improve the performance of your site by reducing the page size. If you are referencing images in your style sheet that are no longer used on your website, it's a waste of performance to leave them in there and have them loaded each time the style sheet is loaded.
Run a web page analyzer against pages in your website so that you can see exactly what is being loaded and what takes the most time to load.
24) Design with value types
Use simple structs when you can, and when you don't do a lot of boxing and unboxing.
Tradeoffs: Value types are far less flexible than objects, and end up hurting performance if used incorrectly. You need to be very careful about when you treat them like objects: this adds extra boxing and unboxing overhead to your program, and can end up costing you more than if you had stuck with objects.
25) Minimize assemblies
Minimize the number of assemblies you use to keep your working set small. If you load an entire assembly just to use one method, you're paying a tremendous cost for very little benefit. See if you can duplicate that method's functionality using code that you already have loaded.
26) Encode using ASCII when you don't need UTF
By default, ASP.NET comes configured to encode requests and responses as UTF-8. If ASCII is all your application needs, eliminating the UTF overhead can give you back a few cycles. Note that this can only be done on a per-application basis (see the sketch after item 28).
27) Avoid recursive functions and nested loops
These are general things to watch in any programming language, as they can consume a lot of memory. Avoid nested loops and recursive functions where you can, to improve performance.
28) Minimize the use of Format()
When you can, use ToString() instead of Format(). In most cases, it will provide you with the functionality you need, with much less overhead.
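A web.config sketch for tip 26; only do this if the application truly never needs non-ASCII input or output:

<system.web>
  <globalization requestEncoding="us-ascii" responseEncoding="us-ascii" />
</system.web>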
29) Place style sheets in the header
Web developers who care about performance want the browser to render whatever content it has as soon as possible. This is especially important for pages with a lot of content and for users with slow Internet connections. When the browser loads the page progressively, the header, the logo, and the navigation components serve as visual feedback for the user. When we place style sheets near the bottom of the HTML, most browsers stop rendering to avoid redrawing elements of the page if their styles change, which decreases the performance of the page. So, always place style sheets in the header.
30) Put scripts at the end of the document
Unlike style sheets, it is better to place scripts at the end of the document. Progressive rendering is blocked until all style sheets have been downloaded, and scripts cause progressive rendering to stop for all content below the script until they are fully loaded. Moreover, while downloading a script, the browser does not start any other component downloads, even on different hostnames. So, always put scripts at the end of the document.
31) Make JavaScript and CSS external
Using external files generally produces faster pages because the JavaScript and CSS files are cached by the browser. Inline JavaScript and CSS increase the HTML document size but reduce the number of HTTP requests. With cached external files, the size of the HTML is kept small without increasing the number of HTTP requests, thus improving performance.

Tips for Database Operations
1) Return multiple resultsets
If your database code has request paths that go to the database more than once, these round trips decrease the number of requests per second your application can serve.
Solution: Return multiple resultsets in a single database request, so that you can cut the total time spent communicating with the database. You'll be making your system more scalable, too, as you cut down on the work the database server does managing requests.
2) Connection pooling and object pooling
Connection pooling is a useful way to reuse connections for multiple requests, rather than paying the overhead of opening and closing a connection for each request. It's done implicitly, but you get one pool per unique connection string. Make sure you call Close or Dispose on a connection as soon as possible; when pooling is enabled, calling Close or Dispose returns the connection to the pool instead of closing the underlying database connection (see the sketch after item 5). Account for the following issues when pooling is a part of your design:
• Share connections.
• Avoid per-user logons to the database.
• Do not vary connection strings.
• Do not cache connections.
3) Use SqlDataReader instead of DataSet wherever possible
If you are reading a table sequentially, you should use the DataReader rather than the DataSet. A DataReader creates a read-only stream of data that increases your application's performance because only one row is in memory at a time.
4) Keep your DataSets lean
Remember that the DataSet stores all of its data in memory, and that the more data you request, the longer it takes to transmit across the wire. Therefore, only put the records you need into the DataSet.
5) Avoid inefficient queries
How it affects performance: Queries that process and then return more columns or rows than necessary waste processing cycles that could best be used for servicing other requests.
Causes of inefficient queries:
• Too much data in your results is usually the result of inefficient queries. The SELECT * query often causes this problem: you do not usually need to return all the columns in a row. Also, analyze the WHERE clause in your queries to ensure that you are not returning too many rows; make the WHERE clause as specific as possible to ensure that the fewest rows are returned.
• Queries that do not take advantage of indexes may also cause poor performance.
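A VB.NET sketch for database tips 2 and 3 (assumes Imports System.Data.SqlClient; the connection string, table, and columns are placeholders): open the connection late, read with a SqlDataReader, and let Using return the connection to the pool.

Using conn As New SqlConnection("Server=(local);Database=Northwind;Integrated Security=SSPI")
    Using cmd As New SqlCommand("SELECT CustomerID, CompanyName FROM dbo.Customers", conn)
        conn.Open()
        Using reader As SqlDataReader = cmd.ExecuteReader()
            While reader.Read()
                ' only one row is held in memory at a time
                Console.WriteLine(reader.GetString(1))
            End While
        End Using
    End Using
End Using   ' Dispose here returns the connection to the pool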
6) Avoid unnecessary round trips
How it affects performance: Round trips significantly affect performance. They are subject to network latency and to downstream server latency. Many data-driven web sites heavily access the database for every user request. While connection pooling helps, the increased network traffic and processing load on the database server can adversely affect performance.
Solution: Keep round trips to an absolute minimum.
7) Avoid too many open connections
Connections are an expensive and scarce resource, which should be shared between callers by using connection pooling. Opening a connection for each caller limits scalability.
Solution: To ensure the efficient use of connection pooling, avoid keeping connections open and avoid varying connection strings.
8) Avoid transaction misuse
How it affects performance: If you select the wrong type of transaction management, you may add latency to each operation. Additionally, if you keep transactions active for long periods of time, the active transactions may cause resource pressure.
Solution: Transactions are necessary to ensure the integrity of your data, but you need to ensure that you use the appropriate type of transaction, for the shortest duration possible, and only where necessary.
9) Avoid over-normalized tables
Over-normalized tables may require excessive joins for simple operations. These additional steps may significantly affect the performance and scalability of your application, especially as the number of users and requests increases.
10) Reduce serialization
DataSet serialization is more efficiently implemented in .NET Framework version 1.1 than in version 1.0. However, DataSet serialization often introduces performance bottlenecks. You can reduce the performance impact in a number of ways:
• Use column name aliasing.
• Avoid serializing multiple versions of the same data.
• Reduce the number of DataTable objects that are serialized.
11) Do not use CommandBuilder at run time
How it affects performance: CommandBuilder objects such as SqlCommandBuilder and OleDbCommandBuilder are useful when you are designing and prototyping your application. However, you should not use them in production applications: the processing required to generate the commands affects performance.
Solution: Manually create stored procedures for your commands, or use the Visual Studio .NET design-time wizard and customize them later if necessary.
12) Use stored procedures whenever possible
• Stored procedures are highly optimized tools that result in excellent performance when used effectively.
• Set up stored procedures to handle inserts, updates, and deletes with the data adapter.
• Stored procedures do not have to be interpreted, compiled, or even transmitted from the client, and they cut down on both network traffic and server overhead.
• Be sure to use CommandType.StoredProcedure instead of CommandType.Text.
13) Avoid auto-generated commands
When using a data adapter, avoid auto-generated commands. These require additional trips to the server to retrieve metadata and give you a lower level of control over the interaction. While using auto-generated commands is convenient, it's worth the effort to do it yourself in performance-critical applications.
14) Use sequential access as often as possible
With a data reader, use CommandBehavior.SequentialAccess. This is essential for dealing with blob data types since it allows data to be read off the wire in small chunks. While you can only work with one piece of the data at a time, the latency for loading a large data type disappears. If you don't need to work with the whole object at once, sequential access will give you much better performance (see the sketch below).
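A VB.NET sketch for tip 14, reading a blob column in chunks (assumes Imports System.Data and an open SqlConnection named conn; the table and column are placeholders):

Dim cmd As New SqlCommand("SELECT Photo FROM dbo.Employees WHERE EmployeeID = 1", conn)
Using reader As SqlDataReader = cmd.ExecuteReader(CommandBehavior.SequentialAccess)
    If reader.Read() Then
        Dim buffer(4095) As Byte
        Dim offset As Long = 0
        Dim bytesRead As Long = reader.GetBytes(0, offset, buffer, 0, buffer.Length)
        While bytesRead > 0
            ' process bytesRead bytes from buffer here
            offset += bytesRead
            bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)
        End While
    End If
End Using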

Stored Procedure Optimization Tips

Use stored procedures instead of heavy-duty queries.
This can reduce network traffic, because your client will send to the server only the stored procedure name (perhaps with some parameters) instead of the text of a large heavy-duty query. Stored procedures can also be used to enhance security and conceal underlying data objects. For example, you can give users permission to execute a stored procedure that works with a restricted set of columns and data.
*****

Include the SET NOCOUNT ON statement in your stored procedures to stop the message indicating the number of rows affected by a Transact-SQL statement.
This can reduce network traffic, because your client will not receive the message indicating the number of rows affected.
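A minimal sketch; the procedure and table names are hypothetical:

CREATE PROCEDURE dbo.usp_GetCustomers
AS
SET NOCOUNT ON
SELECT CustomerID, CompanyName FROM dbo.Customers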
*****

Call stored procedures using their fully qualified names.
The complete name of an object consists of four identifiers: the server name, database name, owner name, and object name. An object name that specifies all four parts is known as a fully qualified name. Using fully qualified names eliminates any confusion about which stored procedure you want to run, and can boost performance because SQL Server has a better chance of reusing a stored procedure's execution plan if it was executed using its fully qualified name.
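For example (database, owner, and procedure names hypothetical):

EXEC usp_GetCustomers                 -- relies on name resolution
EXEC Northwind.dbo.usp_GetCustomers   -- fully qualified: better chance of plan reuse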
*****

Consider returning an integer value with a RETURN statement instead of returning it as part of a recordset.
The RETURN statement exits unconditionally from a stored procedure, so the statements following RETURN are not executed. Though the RETURN statement is generally used for error checking, you can use it to return an integer value for any other reason. Using a RETURN statement can boost performance because SQL Server will not create a recordset.
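A brief sketch (procedure, table, and column names hypothetical):

CREATE PROCEDURE dbo.usp_IsInStock @ProductID int
AS
SET NOCOUNT ON
IF EXISTS (SELECT * FROM dbo.Products WHERE ProductID = @ProductID AND UnitsInStock > 0)
    RETURN 1   -- no recordset is created
RETURN 0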
*****

Don't use the prefix "sp_" in the stored procedure name if you need to create a stored procedure to run in a database other than the master database.The prefix "sp_" is used in the system stored procedures names. Microsoft does not recommend to use the prefix "sp_" in the user-created stored procedure name, because SQL Server always looks for a stored procedure beginning with "sp_" in the following order: the master database, the stored procedure based on the fully qualified name provided, the stored procedure using dbo as the owner, if one is not specified. So, when you have the stored procedure with the prefix "sp_" in the database other than master, the master database is always checked first, and if the user-created stored procedure has the same name as a system stored procedure, the user-created stored procedure will never be executed.
*****

Use the sp_executesql stored procedure instead of the EXECUTE statement.
The sp_executesql stored procedure supports parameters, so using it instead of the EXECUTE statement improves the readability of your code when many parameters are used. When you use sp_executesql to execute a Transact-SQL statement that will be reused many times, the SQL Server query optimizer will reuse the execution plan it generated for the first execution when the only variation is the change in parameter values.
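A parameterized sketch (table, column, and value hypothetical):

EXEC sp_executesql
    N'SELECT CompanyName FROM dbo.Customers WHERE CustomerID = @ID',
    N'@ID nchar(5)',
    @ID = N'ALFKI'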
*****

Use the sp_executesql stored procedure instead of temporary stored procedures.
Microsoft recommends using temporary stored procedures only when connecting to earlier versions of SQL Server that do not support the reuse of execution plans. Applications connecting to SQL Server 7.0 or SQL Server 2000 should use the sp_executesql system stored procedure instead of temporary stored procedures, to have a better chance of reusing execution plans.
*****

If you have a very large stored procedure, try to break it down into several sub-procedures and call them from a controlling stored procedure.
A stored procedure is recompiled when structural changes are made to a table or view it references (for example, by an ALTER TABLE statement), or when a large number of INSERTs, UPDATEs, or DELETEs are made to a table it references. So, if you break a very large stored procedure down into several sub-procedures, there is a good chance that only a single sub-procedure will be recompiled while the others are not.
*****

Try to avoid using temporary tables inside your stored procedures.
Using temporary tables inside a stored procedure reduces the chance of reusing its execution plan.
*****

Try to avoid using DDL (Data Definition Language) statements inside your stored procedures.
Using DDL statements inside a stored procedure reduces the chance of reusing its execution plan.
*****

Add the WITH RECOMPILE option to the CREATE PROCEDURE statement if you know that your query will vary each time it is run from the stored procedure.
The WITH RECOMPILE option prevents reusing the stored procedure execution plan, so SQL Server does not cache a plan for the procedure and the procedure is recompiled at run time. Using the WITH RECOMPILE option can boost performance if your query varies each time it is run from the stored procedure, because in that case the wrong execution plan will not be used.
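A sketch of the option (procedure name and query hypothetical):

CREATE PROCEDURE dbo.usp_SearchOrders @CustomerID nchar(5)
WITH RECOMPILE
AS
SET NOCOUNT ON
-- a query whose best plan varies strongly with the parameter would go here
SELECT OrderID FROM dbo.Orders WHERE CustomerID = @CustomerID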
*****
Use SQL Server Profiler to determine which stored procedures have been recompiled too often.
To check whether a stored procedure has been recompiled, run SQL Server Profiler and trace the event in the "Stored Procedures" category called "SP:Recompile". You can also trace the event "SP:StmtStarting" to see at what point in the procedure it is being recompiled. When you identify these stored procedures, you can take corrective actions to reduce or eliminate the excessive recompilations.

SQL Server Optimization Tips

1. General Tips
• Try to restrict the query's result set by returning only the particular columns you need from the table, not all of the table's columns.
This can result in good performance benefits, because SQL Server will return only the particular columns to the client, not all of the table's columns. This can reduce network traffic and boost the overall performance of the query.
• Try to avoid using SQL Server cursors, whenever possible.
SQL Server cursors can result in some performance degradation in comparison with select statements. Try to use a correlated subquery or derived tables if you need to perform row-by-row operations.
• If you need to return the table's total row count, you can use an alternative to the SELECT COUNT(*) statement.
Because the SELECT COUNT(*) statement makes a full table scan to return the total row count, it can take a very long time for a large table. There is another way to determine it: the sysindexes system table has a ROWS column that contains the total row count for each table in your database. So you can use the following select statement instead of SELECT COUNT(*), improving the speed of such queries several times over:
SELECT rows FROM sysindexes WHERE id = OBJECT_ID('table_name') AND indid < 2
• Use table variables instead of temporary tables.
Table variables require less locking and logging resources than temporary tables, so table variables should be used whenever possible. Table variables are available only in SQL Server 2000 and later.
• Try to avoid using the DISTINCT clause, whenever possible.
Because using the DISTINCT clause will result in some performance degradation, you should use this clause only when it is necessary.
• Include the SET NOCOUNT ON statement in your stored procedures to stop the message indicating the number of rows affected by a T-SQL statement.
This can reduce network traffic, because your client will not receive the message indicating the number of rows affected.
• Use select statements with the TOP keyword or the SET ROWCOUNT statement if you need to return only the first n rows.
This can improve the performance of your queries, because a smaller result set is returned. This can also reduce the traffic between the server and the clients.
• Try to use the UNION ALL statement instead of UNION, whenever possible.
The UNION ALL statement is much faster than UNION, because UNION ALL does not look for duplicate rows, while UNION looks for duplicate rows whether or not they exist.
• Try to use constraints instead of triggers, whenever possible.
Constraints are much more efficient than triggers and can boost performance. So, you should use constraints instead of triggers, whenever possible.
• Use user-defined functions to encapsulate code for reuse.
The user-defined functions (UDFs) contain one or more Transact-SQL statements that can be used to encapsulate code for reuse. Using UDFs can reduce network traffic.
• You can specify whether index keys are stored in ascending or descending order (see the sketch after this list).
For example, using the CREATE INDEX statement with the DESC option (descending order) can increase the speed of queries that return rows in descending order. By default, ascending order is used.
• If you need to delete all of a table's rows, consider using TRUNCATE TABLE instead of the DELETE command.
Using TRUNCATE TABLE is a much faster way to delete all of a table's rows, because it removes all rows from a table without logging the individual row deletes.
• Don't use Enterprise Manager to access remote servers over a slow link or to maintain very large databases.
Because using Enterprise Manager is very resource expensive, use stored procedures and T-SQL statements, in this case.
• Use SQL Server cursors to allow your application to fetch a small subset of rows instead of fetching all of a table's rows.
SQL Server cursors allow an application to fetch any block of rows from the result set, including the next n rows, the previous n rows, or n rows starting at a certain row number. Using SQL Server cursors can reduce network traffic because a smaller result set is returned.
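A sketch for the ascending/descending tip above; the index, table, and column names are hypothetical:

-- speeds up queries that return the newest orders first
CREATE INDEX IX_Orders_OrderDate_Desc ON dbo.Orders (OrderDate DESC)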

2. Tips for designing Tables
• Try to use constraints instead of triggers, rules, and defaults whenever possible.
Constraints are much more efficient than triggers and can boost performance. Constraints are more consistent and reliable in comparison to triggers, rules and defaults, because you can make errors when you write your own code to perform the same actions as the constraints.
• Use char/varchar columns instead of nchar/nvarchar if you do not need to store Unicode data.
A char/varchar value uses one byte per character; an nchar/nvarchar value uses two bytes per character, so char/varchar columns use half the space to store the same data compared with nchar/nvarchar columns.
• If you work with SQL Server 2000, use cascading referential integrity constraints instead of triggers whenever possible.
For example, if you need to make cascading deletes or updates, specify the ON DELETE or ON UPDATE clause in the REFERENCES clause of the CREATE TABLE or ALTER TABLE statements. The cascading referential integrity constraints are much more efficient than triggers and can boost performance.

3. Tips for designing Stored Procedures
• Use stored procedures instead of heavy-duty queries.
This can reduce network traffic, because your client will send to the server only the stored procedure name (perhaps with some parameters) instead of the text of a large heavy-duty query. Stored procedures can also be used to enhance security and conceal underlying data objects. For example, you can give users permission to execute a stored procedure that works with a restricted set of columns and data.
• Call stored procedures using their fully qualified names.
The complete name of an object consists of four identifiers: the server name, database name, owner name, and object name. An object name that specifies all four parts is known as a fully qualified name. Using fully qualified names eliminates any confusion about which stored procedure you want to run, and can boost performance because SQL Server has a better chance of reusing a stored procedure's execution plan if it was executed using its fully qualified name.
• Consider returning an integer value with a RETURN statement instead of returning it as part of a recordset.
The RETURN statement exits unconditionally from a stored procedure, so the statements following RETURN are not executed. Though the RETURN statement is generally used for error checking, you can use it to return an integer value for any other reason. Using a RETURN statement can boost performance because SQL Server will not create a recordset.
• Don't use the prefix "sp_" in the stored procedure name if you need to create a stored procedure to run in a database other than the master database.
The prefix "sp_" is used in the system stored procedures names. Microsoft does not recommend using the prefix "sp_" in the user-created stored procedure name, because SQL Server always looks for a stored procedure beginning with "sp_" in the following order: the master database, the stored procedure based on the fully qualified name provided, the stored procedure using dbo as the owner, if one is not specified. So, when you have the stored procedure with the prefix "sp_" in the database other than master, the master database is always checked first, and if the user-created stored procedure has the same name as a system stored procedure, the user-created stored procedure will never be executed.
• Use the sp_executesql stored procedure instead of the EXECUTE statement.
The sp_executesql stored procedure supports parameters, so using it instead of the EXECUTE statement improves the readability of your code when many parameters are used. When you use sp_executesql to execute a Transact-SQL statement that will be reused many times, the SQL Server query optimizer will reuse the execution plan it generated for the first execution when the only variation is the change in parameter values.

4. Tips for designing Cursors
• Do not forget to close a SQL Server cursor when its result set is no longer needed.
To close a SQL Server cursor, use the CLOSE {cursor_name} command. This command releases the cursor result set and frees any cursor locks held on the rows on which the cursor is positioned.
• Do not forget to deallocate a SQL Server cursor when the data structures comprising it are no longer needed.
To deallocate a SQL Server cursor, use the DEALLOCATE {cursor_name} command. This command removes the cursor reference and releases the data structures comprising the cursor.
• Try to reduce the number of columns processed in the cursor.
Include only the necessary columns in the cursor's select statement. This reduces the cursor result set, so the cursor uses fewer resources, which increases cursor performance and reduces SQL Server overhead.
• Use READ ONLY cursors, whenever possible, instead of updatable cursors.
Because cursors can reduce concurrency and lead to unnecessary locking, use READ ONLY cursors if you do not need to update the cursor result set.
• Try to avoid using insensitive, static, and keyset cursors, whenever possible.
These types of cursors produce the largest amount of overhead on SQL Server, because they cause a temporary table to be created in TEMPDB, which results in some performance degradation.
• Use FAST_FORWARD cursors, whenever possible.
FAST_FORWARD cursors produce the least amount of overhead on SQL Server, because they are read-only cursors and can only be scrolled from the first to the last row. Use a FAST_FORWARD cursor if you do not need to update the cursor result set and FETCH NEXT will be the only fetch option used (see the sketch after this list).
• Use FORWARD_ONLY cursors if you need an updatable cursor and FETCH NEXT will be the only fetch option used.
If you need a read-only cursor and FETCH NEXT will be the only fetch option used, use a FAST_FORWARD cursor instead of a FORWARD_ONLY cursor. By the way, if one of FAST_FORWARD or FORWARD_ONLY is specified, the other cannot be specified.
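A sketch tying these cursor tips together (table and column hypothetical): a FAST_FORWARD cursor that is closed and deallocated as soon as it is finished.

DECLARE @ID nchar(5)
DECLARE curCustomers CURSOR FAST_FORWARD FOR
    SELECT CustomerID FROM dbo.Customers
OPEN curCustomers
FETCH NEXT FROM curCustomers INTO @ID
WHILE @@FETCH_STATUS = 0
BEGIN
    -- process @ID here
    FETCH NEXT FROM curCustomers INTO @ID
END
CLOSE curCustomers
DEALLOCATE curCustomers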
5. Tips for Indexes
• Consider creating indexes on column(s) frequently used in WHERE, ORDER BY, and GROUP BY clauses.
These column(s) are the best candidates for index creation. Analyze your queries very attentively to avoid creating indexes that will not be useful.
• Drop indexes that are not used.
Because each index takes up disk space and slows the adding, deleting, and updating of rows, you should drop indexes that are not used. You can use the Index Tuning Wizard to identify indexes that are not used in your queries.
• Try to create indexes on columns that have integer values rather than character values.
Because integer values are usually smaller than character values (the int data type is 4 bytes, the bigint data type is 8 bytes), you can reduce the number of index pages used to store the index keys. This reduces the number of reads required to read the index and boosts overall index performance.
• Limit the number of indexes if your application updates data very frequently.
Because each index takes up disk space and slows the adding, deleting, and updating of rows, you should create new indexes only after analyzing how the data is used, the types and frequencies of the queries performed, and how your queries will use the new indexes. In many cases, the speed advantages of creating new indexes outweigh the disadvantages of additional space used and slower row modification. However, avoid redundant indexes; create them only when necessary. For a read-only table, the number of indexes can be increased.
• Check that the index you are about to create does not already exist.
Keep in mind that when you create a primary key constraint or unique key constraint, SQL Server automatically creates an index on the column(s) that participate in the constraint. If you specify another index name, you can end up creating indexes on the same column(s) again and again.
• Create a clustered index, rather than a nonclustered one, to increase the performance of queries that return a range of values and of queries that contain GROUP BY or ORDER BY clauses and return sorted results.
Because every table can have only one clustered index, you should choose the column(s) for this index very carefully. Try to analyze all your queries, choose the most frequently used ones, and include in the clustered index only those column(s) that provide the most performance benefit from its creation.
• Create nonclustered indexes to increase the performance of queries that return few rows and where the index has good selectivity.
In comparison with a clustered index, of which there can be only one per table, each table can have as many as 249 nonclustered indexes. However, you should consider nonclustered index creation as carefully as clustered index creation, because each index takes up disk space and drags on data modification.
• Avoid creating a clustered index based on an incrementing key.
For example, if a table has a surrogate integer primary key declared as IDENTITY and the clustered index was created on this column, then every time data is inserted into this table the rows are added at the end of the table. When many rows are being added, a "hot spot" can occur: many queries trying to read or write data in the same area at the same time, which results in an I/O bottleneck.
Note: by default, SQL Server creates a clustered index for the primary key constraint. So, in this case, you should explicitly specify the NONCLUSTERED keyword to indicate that a nonclustered index should be created for the primary key constraint, as shown below.
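A minimal sketch of declaring the primary key as nonclustered (the table and columns are illustrative only):

CREATE TABLE Orders (
    OrderID int IDENTITY(1,1) PRIMARY KEY NONCLUSTERED, -- PK backed by a nonclustered index
    OrderDate datetime NOT NULL
)
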
• Create a clustered index for each table.
If you create a table without a clustered index, the data rows are not stored in any particular order. This structure is called a heap. Every time data is inserted into such a table, the row is added at the end of the table. When many rows are being added, a "hot spot" can occur. To avoid hot spots and improve concurrency, you should create a clustered index for each table.
• If you create a composite (multi-column) index, try to order the columns in the key so that the WHERE clauses of the frequently used queries match the column(s) that are leftmost in the index.
The order of the columns in a composite (multi-column) index is very important. The index will be used to evaluate a query only if the leftmost index key column is specified in the WHERE clause of the query. For example, if you create a composite index on "Name, Age", then a query with the WHERE clause "WHERE Name = 'Alex'" will use the index, but a query with the WHERE clause "WHERE Age = 28" will not.
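The same example in code form, assuming a hypothetical Person table:

CREATE INDEX IX_Person_Name_Age ON Person (Name, Age)

SELECT * FROM Person WHERE Name = 'Alex' -- leftmost column specified: index can be used
SELECT * FROM Person WHERE Age = 28 -- leftmost column missing: index will not be used
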
• If you need to join several tables very frequently, consider creating indexes on the joined columns.
This can significantly improve the performance of queries against the joined tables. For example:
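A sketch in which the CustomerID join column and table names are assumptions for illustration:

CREATE INDEX IX_Orders_CustomerID ON Orders (CustomerID)

SELECT c.CompanyName, o.OrderDate
FROM Customers c
JOIN Orders o ON o.CustomerID = c.CustomerID -- the join column is now indexed
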
• If your application performs the same query over and over on the same table, consider creating a covering index including the columns from this query.
A covering index is an index that includes all of the columns referenced in the query. Creating a covering index can improve performance because all the data for the query is contained within the index itself; only the index pages, not the data pages, are used to retrieve the data. Covering indexes can bring a large performance gain to a query, because they can save a huge number of I/O operations.
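A sketch of the idea, with an assumed Products table; note that on SQL Server 2005 and later the INCLUDE clause can also add non-key columns to an index for the same purpose:

CREATE INDEX IX_Products_Category_Price ON Products (CategoryID, UnitPrice)

-- Every referenced column lives in the index, so no data pages are touched:
SELECT CategoryID, UnitPrice FROM Products WHERE CategoryID = 3
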
• Use the SQL Server Profiler Create Trace Wizard with the "Identify Scans of Large Tables" trace to determine which tables in your database may need indexes.
This trace will show which tables are being scanned by queries instead of using an index.

SQL Optimization Tips

• Use views and stored procedures instead of heavy-duty queries.
This can reduce network traffic, because your client will send the server only a stored procedure or view name (perhaps with some parameters) instead of the text of a large heavy-duty query. This can also be used to facilitate permission management, because you can restrict user access to table columns they should not see. A minimal sketch follows.
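A sketch of wrapping a heavy query in a stored procedure (all object names are assumptions):

CREATE PROCEDURE GetOrdersByCustomer
    @CustomerID int
AS
    SELECT OrderID, OrderDate
    FROM Orders
    WHERE CustomerID = @CustomerID
GO

-- The client now sends only this short call instead of the full query text:
EXEC GetOrdersByCustomer @CustomerID = 42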

• Try to use constraints instead of triggers, whenever possible.
Constraints are much more efficient than triggers and can boost performance, so use constraints instead of triggers whenever they can express the rule you need.

• Use table variables instead of temporary tables.
Table variables require fewer locking and logging resources than temporary tables, so table variables should be used whenever possible. Note that table variables are available only in SQL Server 2000 and later.

• Try to use the UNION ALL statement instead of UNION, whenever possible.
The UNION ALL statement is much faster than UNION, because UNION ALL does not look for duplicate rows, while the UNION statement does look for duplicate rows whether or not any exist.
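For example, if you already know the two result sets cannot overlap (say, assumed current and archive tables that never share rows), UNION ALL skips the duplicate-elimination step entirely:

SELECT OrderID FROM Orders
UNION ALL -- no sort/distinct step to remove duplicates
SELECT OrderID FROM OrdersArchive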

• Try to avoid using the DISTINCT clause, whenever possible.
Because using the DISTINCT clause will result in some performance degradation, you should use this clause only when it is necessary.

• Try to avoid using SQL Server cursors, whenever possible.
SQL Server cursors can result in some performance degradation in comparison with set-based SELECT statements. Try to use a correlated sub-query or derived tables if you need to perform row-by-row operations.

• Try to avoid the HAVING clause, whenever possible.
The HAVING clause is used to restrict the result set returned by the GROUP BY clause. When you use GROUP BY with the HAVING clause, the GROUP BY clause divides the rows into sets of grouped rows and aggregates their values, and then the HAVING clause eliminates undesired aggregated groups. In many cases, you can write your SELECT statement so that it contains only WHERE and GROUP BY clauses, without a HAVING clause. This can improve the performance of your query, as in the sketch below.
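When the condition does not involve an aggregate, it can usually be moved from HAVING into WHERE, so rows are filtered before grouping rather than after (table and column names assumed for illustration):

-- Filters the groups after aggregation:
SELECT CustomerID, COUNT(*) FROM Orders GROUP BY CustomerID HAVING CustomerID > 100

-- Filters the rows before aggregation; usually cheaper:
SELECT CustomerID, COUNT(*) FROM Orders WHERE CustomerID > 100 GROUP BY CustomerID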

• If you need to return the table's total row count, you can use an alternative to the SELECT COUNT(*) statement.
Because the SELECT COUNT(*) statement performs a full table scan to return the total row count, it can take a very long time on a large table. There is another way to determine the total row count in a table: the sysindexes system table. The ROWS column of sysindexes contains the total row count for each table in your database. So, you can use the following SELECT statement instead of SELECT COUNT(*):

SELECT rows FROM sysindexes
WHERE id = OBJECT_ID('table_name') AND indid < 2

This can improve the speed of such queries several times over.

• Include a SET NOCOUNT ON statement in your stored procedures to stop the message indicating the number of rows affected by a T-SQL statement.
This can reduce network traffic, because your client will not receive the rows-affected message after each statement. For example:
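A minimal sketch (the procedure body is illustrative):

CREATE PROCEDURE UpdatePrices
AS
    SET NOCOUNT ON -- suppress the "N rows affected" message to the client
    UPDATE Products SET UnitPrice = UnitPrice * 1.1
GO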

• Try to restrict the query's result set by using the WHERE clause.
This can result in good performance benefits, because SQL Server will return to the client only the particular rows it needs, not all rows from the table(s). This reduces network traffic and boosts the overall performance of the query.

• Use SELECT statements with the TOP keyword or the SET ROWCOUNT statement if you need to return only the first n rows.
This can improve the performance of your queries, because a smaller result set is returned. It also reduces the traffic between the server and the clients. For example:
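Two equivalent sketches for returning just the first ten rows (the ordering column is assumed):

SELECT TOP 10 OrderID, OrderDate
FROM Orders
ORDER BY OrderDate DESC

SET ROWCOUNT 10
SELECT OrderID, OrderDate FROM Orders ORDER BY OrderDate DESC
SET ROWCOUNT 0 -- reset so later statements are not affected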

• Try to restrict the query's result set by returning only the particular columns from the table, not all of the table's columns.
This can result in good performance benefits, because SQL Server will return to the client only the particular columns, not all of the table's columns. This reduces network traffic and boosts the overall performance of the query.
In short, the main things to check when tuning a query are:
1. Indexes
2. Avoiding too many triggers on the table
3. Avoiding unnecessarily complicated joins
4. Correct use of the GROUP BY clause with the select list
5. In the worst cases, denormalization


Index Optimization Tips

• Every index increases the time it takes to perform INSERTs, UPDATEs, and DELETEs, so the number of indexes should not be too high. Try to use a maximum of 4-5 indexes on one table, not more. If you have a read-only table, then the number of indexes may be increased.

• Keep your indexes as narrow as possible. This reduces the size of the index and reduces the number of reads required to read the index.

• Try to create indexes on columns that have integer values rather than character values.

• If you create a composite (multi-column) index, the order of the columns in the key is very important. Try to order the columns in the key to enhance selectivity, with the most selective columns leftmost in the key.

• If you want to join several tables, try to create surrogate integer keys for this purpose and create indexes on their columns.

• Create a surrogate integer primary key (an identity column, for example) if your table will not have many insert operations.

• Clustered indexes are preferable to nonclustered indexes if you need to select by a range of values or you need to sort the result set with GROUP BY or ORDER BY.

• If your application will be performing the same query over and over on the same table, consider creating a covering index on the table.

• You can use the SQL Server Profiler Create Trace Wizard with the "Identify Scans of Large Tables" trace to determine which tables in your database may need indexes. This trace will show which tables are being scanned by queries instead of using an index.

• You can use the undocumented sp_MSforeachtable stored procedure to rebuild all indexes in your database. Try to schedule it to execute during CPU idle time and slow production periods:
EXEC sp_MSforeachtable @command1="print '?' DBCC DBREINDEX ('?')"

.NET Framework

Fig : The .NET Framework architecture (the .NET languages, the CLS, the FCL, and the CLR)

.Net Framework basics
When we speak about .Net, we mean the .NET Framework. The .NET Framework is made up of the Common Language Runtime (CLR) and the Base Class Library (the System classes). This allows us to build our own services (Web Services or Windows Services), Web applications (Web Forms or ASP.NET), and Windows applications (Windows Forms). Let us see how this is all put together.

The picture above shows the overall architecture, demonstrating how the .NET languages follow the rules provided by the Common Language Specification (CLS). These languages can all be used independently to build applications, and can all be used with the built-in data describers (XML) and data accessors (ADO.NET and SQL). Every component of the .NET Framework can take advantage of the large pre-built library of classes called the Framework Class Library (FCL). Once everything is put together, the code that is created is executed in the Common Language Runtime. The Common Language Runtime is designed to allow any .NET-compliant language to execute its code. At the time of writing, these languages include VB .NET, C#, and C++ .NET, but any language can become .NET-compliant if it follows the CLS rules. The following sections address each part of the architecture.
.Net Common Language Specification (CLS):
In an object-oriented environment, everything is considered an object. (This point is explained in this article; more advanced features are explained in other articles.) You create a template for an object (this is called the class file), and this class file is used to create multiple objects.
TIP: Consider a Rectangle. You may want to create many Rectangles in your lifetime, but each Rectangle will have certain characteristics and certain functions. For example, each Rectangle will have a specific width and color. So now, suppose your friend also wants to create a Rectangle. Why reinvent the Rectangle? You can create a common template and share it with others, and they create their Rectangles based on your template. This is the heart of object-oriented programming: the template is the class file, and the Rectangle is an object built from that class. Once you have created an object, your object needs to communicate with many other objects.
It does not matter if the other object was created in another .NET language, because each language follows the rules of the CLS. The CLS defines the necessary things, such as common variable types (this is called the Common Type System, or CTS), common visibility (when and where one can see these variables), common method specifications, and so on. There is not one rule that tells how C# composes its objects and another rule that tells how VB .NET does the same thing. To steal a phrase, there is now "one rule to bind them all." One thing to note here is that the CLS simply provides the bare rules. Languages can adhere to their own specification; in that case, the actual compilers do not need to be as powerful as those that support the full CLS.
The Common Language Runtime (CLR):
The heart of the .NET Framework is the Common Language Runtime (CLR). All .NET-compliant languages run in a common, managed runtime execution environment. With the CLR, you can rely on code that is accessed from different languages: one coder can write one module in C#, and another can access and use it from VB .NET. The .NET languages also take care of memory issues automatically through automatic object management. These are a few of the benefits you get from the CLR.
Microsoft Intermediate Language (MSIL):
So how can many different languages be brought together and executed together? Through Microsoft Intermediate Language (MSIL) or, as it is more commonly known, Intermediate Language (IL). In its simplest terms, IL is a programming language. If you wanted to, you could write IL directly, compile it, and run it. But why would you want to write such low-level code when Microsoft has provided higher-level languages, such as C#, that you can use? Before the code is executed, the MSIL must be converted into platform-specific code. The CLR includes a JIT (just-in-time) compiler, and the compilation order is as follows.
Source Code => Compiler => Assembly => Class Loader => JIT Compiler => Managed Native Code => Execution
This is the order of compilation and execution of programs. Once a program is written in a .NET-compliant language, the rest is the responsibility of the framework.



Ref :
http://www.codersource.net/basics_dot_net_framework.html





download here!