http://blog.1SmartSolution.com/
1SmartSolution Blog

Small JDBC gotcha
A few days ago I was playing with MyConnect Server and had to install the JDBC driver for MSSQL in order to save speed-test data to a database. And still, no data got inserted into the DB. Checking the logs, I found "Java Runtime Environment (JRE) version 1.7 is not supported by this driver. Use the sqljdbc4.jar class library", which was weird, as JDBC 4 is exactly what I had downloaded and installed. The answer came after checking the driver's installation folder. Apparently, because there was both the
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=343&Small-JDBC-gotcha
Thu, 18 Jul 2013 15:46:00 -0400

How To Get Primary Key In MS SQL
Some time ago I had a nightmare when trying to create a Full-Text Index for a table with a "weird" name for the Primary Key. And because I did not have access to the DB structure, I had to find a way to get the PK info programmatically. So, for my later reference and for anyone who has similar problems, here is the code to find the Primary Key name in MS SQL:
select *
from sysobjects
where xtype='pk' and
parent_obj in (select id from sysobjects where name='tablename')
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=341&How-To-Get-Primary-Key-In-MS-SQL
Wed, 19 Sep 2012 20:34:00 -0400

Is your site still vulnerable to SQLi attacks?
A new report by iMPERVA shows that
from 2005 through today, SQL injection has been responsible for 83% of successful hacking-related data breaches.
Based on the sites they were monitoring, over half of the attacks originate in the United States.
Country          Number of Originating Attacks   %
United States    48176                           58
Sweden           8850                            11
China            6709                            8
Great Britain    4970                            6
Vietnam          2412                            3
Netherlands      1963                            2
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=328&Is-your-site-still-vulnerable-to-SQLi-attacks
Wed, 02 Nov 2011 11:38:00 -0400

When Much Is TOO MUCH
Got an interesting error message today:
[Macromedia][SQLServer JDBC Driver][SQLServer]The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Too many parameters were provided in this RPC request. The maximum is 2100.
When checking the code, I found that on the line where the error happened I have a query with an INSERT in a loop. Why make, say, 10 queries with INSERTs when you can have 1 query with 10 INSERTs? But it seems I got into a situation wh
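One way around that 2100-parameter ceiling is to batch the multi-row INSERT so each statement stays under the cap. A rough sketch in Python with sqlite3 standing in for the MSSQL target (table name and the conservative cap are made up for the demo; SQL Server's actual limit is 2100 parameters per RPC):

```python
import sqlite3

MAX_PARAMS = 900  # keep under the engine's limit; for SQL Server the ceiling is 2100

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER)")

rows = [(i, i * 2) for i in range(5000)]
cols = 2
rows_per_stmt = MAX_PARAMS // cols  # rows per multi-row INSERT

for start in range(0, len(rows), rows_per_stmt):
    chunk = rows[start:start + rows_per_stmt]
    placeholders = ",".join("(?,?)" for _ in chunk)
    flat = [v for row in chunk for v in row]
    # One statement per chunk: far fewer round trips than one INSERT per row,
    # but never more than MAX_PARAMS bound parameters in a single statement.
    conn.execute(f"INSERT INTO t (a, b) VALUES {placeholders}", flat)

inserted = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]  # 5000
```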
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=325&When-Much-Is-TOO-MUCH
Mon, 18 Apr 2011 11:18:00 -0400

ColdFusion. MSSQL. CSV Import. Performance.
Had an interesting task these last days: to make a script that would access a couple of links, get a zip file from each one, extract the CSVs from them, and then import the records to an MSSQL database.
So with just a few ColdFusion tags like CFSET, CFHTTP, CFZIP, CFLOOP, CFDIRECTORY and CFQUERY, it's a piece of cake to make it work. BUT (yeah, yeah, again a BUT, hehe), because the CSVs had a lot of records and it had to be done repeatedly for many of them, it came to be a serious issue from a performance perspecti
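The post is cut off before the fix, but the usual first step for this kind of import is to batch the inserts instead of issuing one query per CSV record. A hypothetical sketch in Python with sqlite3 standing in for the MSSQL target (the file contents and table are invented):

```python
import csv
import io
import sqlite3

# A small in-memory stand-in for one of the extracted CSV files.
csv_data = io.StringIO("id,price\n1,9.99\n2,19.50\n3,4.25\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (id INTEGER, price REAL)")

reader = csv.reader(csv_data)
next(reader)  # skip the header row

# executemany sends the rows as one batch instead of one INSERT per row,
# which is where most of the per-record overhead goes.
conn.executemany(
    "INSERT INTO prices (id, price) VALUES (?, ?)",
    ((int(r[0]), float(r[1])) for r in reader),
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0]  # 3
```

On SQL Server itself, BULK INSERT or bcp would be the next step up from batching, but the batching idea alone already removes most of the per-row cost.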
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=322&ColdFusion-MSSQL-CSV-Import-Performance
Wed, 12 Jan 2011 13:42:00 -0400

Quickie: Operators Order Of Efficiency
Equals: =
This is the most efficient operator in a SQL query, especially if the WHERE clause is covered by an index.
Greater than, less than: >, >=, <, <=
Note that Microsoft SQL Server rewrites a BETWEEN to >= and <=.
Like: LIKE
Note that if you start with a wildcard, then SQL Server will do a table scan, whether there is an index or not, such as: LIKE '%E'.
Different from: <>, NOT
Note that Microsoft SQL Server rewrites a NOT to <>.
Note that SQL Server will do a ta
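The index-versus-scan behaviour described above is visible in a query plan. A small demonstration using SQLite's EXPLAIN QUERY PLAN (SQL Server plans look different, but the seek-versus-scan distinction is the same; the table and index names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT)")
conn.execute("CREATE INDEX idx_emp_name ON emp (name)")
conn.executemany(
    "INSERT INTO emp VALUES (?, ?)",
    [("ALICE", "HR"), ("BOB", "IT"), ("EVE", "IT")],
)

def plan(where):
    # The last column of each EXPLAIN QUERY PLAN row is the human-readable detail.
    rows = conn.execute(
        f"EXPLAIN QUERY PLAN SELECT * FROM emp WHERE {where}"
    ).fetchall()
    return " ".join(r[-1] for r in rows)

eq_plan = plan("name = 'EVE'")      # equality can seek the index (SEARCH)
like_plan = plan("name LIKE '%E'")  # leading wildcard forces a full table SCAN
```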
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=320&Quickie-Operators-Order-Of-Efficiency
Thu, 16 Dec 2010 14:45:00 -0400

Change Column Collation In MSSQL
Sometimes it may happen that a database has fields with different collations, and that will cause you problems. Previously I wrote about a way to get over this issue directly in your queries. Another option is to change the collation on the SQL server for that "problem" column.
Here is an example of how to do it:
Alter Table yourTable
Alter Column yourColumn varchar(1000)
Collate Latin1_General_CI_AS
Easy :)
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=316&Change-Column-Collation-In-MSSQL
Sat, 04 Dec 2010 21:41:00 -0400

How to repair a SUSPECT SQL database
This weekend I had a hair-pulling moment when I noticed that on one of my sites the DB was down with a "SUSPECT" mark on it. Checking through the logs told me that at restart the replication didn't go well. I tried many different things to solve the problem, but nothing worked till I found a short list of commands that worked just FINE! Maybe I lost a couple of broken records, but that's fine considering that nothing else worked and I was about to lose everything.
So here is the magic code:
EXEC sp_resetst
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=312&How-to-repair-a-SUSPECT-SQL-database
Mon, 22 Nov 2010 12:47:00 -0400

Making A Faster BETWEEN Dates
Sometimes, finding records that exist between 2 dates can be slow, especially when the check is done against a JOINed table, as in the following example:
Select tbl1.fld1, tbl1.fld2, tbl2.fld3
From tbl1
Inner Join tbl2 On tbl2.dateFld BETWEEN tbl1.startDate And tbl1.endDate
And here is a way to increase the performance with a simple change:
Select tbl1.fld1, tbl1.fld2, tbl2.fld3
From tbl1
Inner Join tbl2 On tbl1.startDate <= tbl2.dateFld And tbl1.endDate >= tbl2.dateFld
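The listing above is truncated, but whatever the exact rewrite, the BETWEEN join and an explicit >= / <= join must return the same rows (SQL Server itself rewrites BETWEEN that way); only the plan can differ. A quick equivalence check in Python with sqlite3, using the post's table names with invented data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tbl1 (fld1 TEXT, startDate TEXT, endDate TEXT)")
conn.execute("CREATE TABLE tbl2 (fld3 TEXT, dateFld TEXT)")
conn.execute(
    "INSERT INTO tbl1 VALUES ('a', '2010-01-01', '2010-01-31'),"
    " ('b', '2010-02-01', '2010-02-28')"
)
conn.execute(
    "INSERT INTO tbl2 VALUES ('x', '2010-01-15'), ('y', '2010-02-10'),"
    " ('z', '2010-03-05')"
)

# The original form: range test via BETWEEN in the join condition.
between = conn.execute(
    "SELECT tbl1.fld1, tbl2.fld3 FROM tbl1 "
    "INNER JOIN tbl2 ON tbl2.dateFld BETWEEN tbl1.startDate AND tbl1.endDate "
    "ORDER BY 1, 2"
).fetchall()

# The rewritten form: the same range spelled out as two comparisons.
explicit = conn.execute(
    "SELECT tbl1.fld1, tbl2.fld3 FROM tbl1 "
    "INNER JOIN tbl2 ON tbl1.startDate <= tbl2.dateFld "
    "AND tbl1.endDate >= tbl2.dateFld "
    "ORDER BY 1, 2"
).fetchall()
```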
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=308&Making-A-Faster-BETWEEN-Dates
Tue, 02 Nov 2010 10:48:00 -0400

Dealing With Duplicates
Often there are situations where duplicate rows exist in a table and we need to get the ones that are unique or delete the duplicates. Earlier this year I had a post about partitioning; now let's see how it works. Microsoft SQL Server 2005 added the Row_Number() Over(Partition By ... Order By ...) feature, and it can be used efficiently in such situations.
First let's build the scenario.
Create the table
create table Emp_Details
( Emp_Name varchar(10)
, Company varchar(15)
,
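The setup above is truncated, so here is a self-contained sketch of the technique in Python with sqlite3 (window functions need SQLite 3.25+; the data is invented): number the copies inside each (Emp_Name, Company) group with ROW_NUMBER() OVER (PARTITION BY ...), then delete every copy after the first.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Emp_Details (Emp_Name TEXT, Company TEXT)")
conn.executemany(
    "INSERT INTO Emp_Details VALUES (?, ?)",
    [("John", "Acme"), ("John", "Acme"), ("Mary", "Acme"), ("John", "Acme")],
)

# rn = 1 marks the first copy in each group; everything with rn > 1 is a duplicate.
conn.execute("""
    DELETE FROM Emp_Details
    WHERE rowid IN (
        SELECT rowid FROM (
            SELECT rowid,
                   ROW_NUMBER() OVER (
                       PARTITION BY Emp_Name, Company
                       ORDER BY rowid
                   ) AS rn
            FROM Emp_Details
        )
        WHERE rn > 1
    )
""")

remaining = conn.execute(
    "SELECT Emp_Name, Company FROM Emp_Details ORDER BY Emp_Name"
).fetchall()
```

In SQL Server the shape is the same, with a CTE over the table and `DELETE FROM cte WHERE rn > 1`.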
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=305&Dealing-With-Duplicates
Wed, 22 Sep 2010 14:05:00 -0400

Sub-Query Gotcha
When you need, for example, to compare the value of a table field with, say, the resulting Sum of a sub-query, you may get unexpected results when the subquery has no records. But using a simple IsNull around the Sum solves this issue. So a query like:
Select fld1, fld2
From myTbl
Where fld3 >
(
Select Sum(fld1)
From myTbl2
Where fld2 = myTbl.fld2
)
would become:
Select fld1, fld2
From myTbl
Where fld3 >
(
Select IsNull(Sum(fld1), 0)
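The second listing is cut off, but the effect is easy to reproduce. A sketch in Python with sqlite3, using COALESCE as the portable stand-in for IsNull (the data is made up): SUM over zero rows yields NULL, and a comparison against NULL is never true, so the outer row silently disappears.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTbl (fld1 INTEGER, fld2 TEXT, fld3 INTEGER)")
conn.execute("CREATE TABLE myTbl2 (fld1 INTEGER, fld2 TEXT)")
conn.execute("INSERT INTO myTbl VALUES (1, 'a', 10)")  # no matching rows in myTbl2

# SUM over zero rows is NULL, and "10 > NULL" is unknown, so the row is lost.
bare = conn.execute(
    "SELECT fld1 FROM myTbl WHERE fld3 > "
    "(SELECT SUM(fld1) FROM myTbl2 WHERE fld2 = myTbl.fld2)"
).fetchall()

# COALESCE turns the NULL into 0 and the comparison behaves as intended.
guarded = conn.execute(
    "SELECT fld1 FROM myTbl WHERE fld3 > "
    "(SELECT COALESCE(SUM(fld1), 0) FROM myTbl2 WHERE fld2 = myTbl.fld2)"
).fetchall()
```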
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=304&SubQuery-Gotcha
Mon, 20 Sep 2010 14:43:00 -0400

On Query DateTime Fields Performance
After the changes I made 2 weeks ago regarding query performance, a few days ago I started to notice that the same page was performing very badly. Within days, the loading time grew from 5-10 seconds to 18-20 seconds and then to over 50 seconds (it should be said that the page runs over 80 queries and QoQ, some of the queries being very big with lots of joins and sub-queries). Yes, rows have been added to particular tables, which made the total number of records in play grow considerably. Yes, the D
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=303&On-Query-DateTime-Fields-Performance
Thu, 02 Sep 2010 21:07:00 -0400

Optimization. Order. Performance.
Yesterday I got a query that was performing VERY badly. It had 4 joined tables and 4 subqueries. Nothing REALLY complicated, but it was taking 80-100 seconds to execute. That's right, about 1.5 minutes!
The first thing that came to my head, and it usually helps, was: indexes. I created them, but... they didn't help much in this case.
So I tried to analyze the execution plan. What caught my attention was that one of the sub-queries had an over 20% cost. But it was a simple one, so nothing really could be done
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=298&Optimization-Order-Performance
Fri, 20 Aug 2010 19:18:00 -0400

Little SQL Trick: Find the Port a Connection is Using
This should help you determine on which port(s) connections were coming through from the client to the database server:
SELECT c.session_id, c.local_tcp_port, s.login_name, s.host_name, s.program_name
FROM sys.dm_exec_connections AS c INNER JOIN
sys.dm_exec_sessions AS s on c.session_id = s.session_id
WHERE c.local_tcp_port <> 1433
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=292&Little-SQL-Trick-Find-the-Port-a-Connection-is-Using
Wed, 07 Jul 2010 17:26:00 -0400

Solution to System.OutOfMemoryException Exception in MS SQL 2005
I have been trying to run quite a big SQL script on MS SQL 2005, getting "Exception of type 'System.OutOfMemoryException' was thrown. (mscorlib)".
The solution is simple, run the following CMD command:
sqlcmd -S YOURSQLSERVER\INSTANCENAME -U YOURUSERNAME -P YOURPASSWORD -d YOURDATABASE -i "c:\somescript.sql"
Note that the parameters like -U and -P are case sensitive!
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=286&Solution-to-SystemOutOfMemoryException-Exception-in-MS-SQL-2005
Mon, 10 May 2010 11:24:00 -0400

Ways To Retrieve Data From An Audit Table
Retrieving information from a history table for a given point in time is a common data task. For example, using a table that contains all historical prices,
what were the prices of all products as they were 30 days ago? There are a number of different methods for obtaining this information from the audit table.
Sql-Server-Performance.com had a nice article in March comparing 5 ways of Retrieving Data from an Audit Table:
Using a subquery to determine which record for each prod
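The list above is truncated at the first method; as a sketch of that subquery approach, here is a point-in-time lookup in Python with sqlite3 (the price_history table and its data are hypothetical): for every product, pick the latest change that is not after the cutoff date.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE price_history (product TEXT, price REAL, changed TEXT)")
conn.executemany(
    "INSERT INTO price_history VALUES (?, ?, ?)",
    [
        ("widget", 10.0, "2010-01-01"),
        ("widget", 12.0, "2010-03-01"),
        ("widget", 15.0, "2010-04-05"),
        ("gadget", 5.0, "2010-02-01"),
    ],
)

as_of = "2010-03-10"
# The correlated subquery finds, per product, the newest change <= the cutoff;
# the outer query keeps only the row carrying that change.
prices = conn.execute(
    """
    SELECT h.product, h.price
    FROM price_history AS h
    WHERE h.changed = (
        SELECT MAX(h2.changed)
        FROM price_history AS h2
        WHERE h2.product = h.product AND h2.changed <= ?
    )
    ORDER BY h.product
    """,
    (as_of,),
).fetchall()
```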
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=283&Ways-To-Retrieve-Data-From-An-Audit-Table
Fri, 09 Apr 2010 17:24:00 -0400

Table Variables vs. Temporary Tables
There are often questions about which is better, and of course the answer is: It Depends.
Below is a summary comparing Table Variables and Temporary Tables side by side.

Feature      Table Variables            Temporary Tables
Scope        Current batch              Current session, nested stored procedures; global: all sessions
Usage        UDFs, Stored Procedures,   Stored Procedures, Triggers, Batches
             Triggers, Batches
Creation     DECLARE statement only     CREATE TABLE or SELECT INTO statement
Table name   Maximum 128 characters     Maximu
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=282&Table-Variables-vs-Temporary-Tables
Fri, 09 Apr 2010 16:50:00 -0400

Monitor or Report?
By Larry Gonick
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=281&Monitor-or-Report
Wed, 07 Apr 2010 11:49:00 -0400

SQL UPDATE With Auto Increment
Recently, working on a project, I needed to make an Update of N rows, setting a field with an auto-incremented value. You know, like counting rows in different groups.
The solution came to be quite simple.
So, the initial query of:
Update leadOffers
Set completed = <cfqueryparam cfsqltype="CF_SQL_INTEGER" value="1">
, pageid = <cfqueryparam cfsqltype="CF_SQL_INTEGER" value="#pageid#">
WHERE leadid = <cfqueryparam cfsqltype="CF_SQL_INTEGER" value="#leadID#">
and offerID in (#of
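The post is cut off before the solution, so as one hedged sketch: since SQL Server 2005, ROW_NUMBER() OVER (PARTITION BY ...) is a natural way to generate a per-group counter. A Python/sqlite3 illustration (window functions need SQLite 3.25+), with a hypothetical `position` column receiving the counter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE leadOffers (leadid INTEGER, offerID INTEGER, position INTEGER)"
)
conn.executemany(
    "INSERT INTO leadOffers (leadid, offerID) VALUES (?, ?)",
    [(1, 10), (1, 11), (1, 12), (2, 20), (2, 21)],
)

# Number the rows within each leadid group...
numbered = conn.execute(
    "SELECT rowid, ROW_NUMBER() OVER (PARTITION BY leadid ORDER BY offerID) "
    "FROM leadOffers"
).fetchall()
# ...then write the counter back; the count restarts at 1 for every leadid.
conn.executemany(
    "UPDATE leadOffers SET position = ? WHERE rowid = ?",
    [(rn, rid) for rid, rn in numbered],
)

positions = conn.execute(
    "SELECT leadid, offerID, position FROM leadOffers ORDER BY leadid, offerID"
).fetchall()
```

In T-SQL the two steps collapse into one UPDATE over a CTE containing the ROW_NUMBER expression.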
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=274&SQL-UPDATE-With-Auto-Increment
Mon, 22 Mar 2010 20:45:00 -0400

Data Access Optimization In SQL Server: Partitioning
Apply partitioning on big and fat tables
Table partitioning means splitting a large table into multiple smaller tables so that queries have to scan a smaller amount of data while retrieving. This is very useful when you have a very large table (possibly millions of records) in your database and you see that, after all other optimization steps, querying that table is still slow. The following two options are available for partitioning a table.
Horizontal partiti
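As a rough illustration of horizontal partitioning (splitting rows across tables and stitching them back together with a view; SQL Server's partitioned tables and partitioned views do this more formally), a sketch in Python with sqlite3 using invented order tables split by year:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One physical table per year instead of a single huge table.
conn.execute("CREATE TABLE orders_2009 (id INTEGER, placed TEXT)")
conn.execute("CREATE TABLE orders_2010 (id INTEGER, placed TEXT)")
conn.execute("INSERT INTO orders_2009 VALUES (1, '2009-06-01'), (2, '2009-12-30')")
conn.execute("INSERT INTO orders_2010 VALUES (3, '2010-01-02')")

# A view stitches the partitions back together for queries that need everything;
# queries that only touch one year read only that year's (smaller) table.
conn.execute("""
    CREATE VIEW orders AS
    SELECT * FROM orders_2009
    UNION ALL
    SELECT * FROM orders_2010
""")

total = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]        # 3
recent = conn.execute("SELECT COUNT(*) FROM orders_2010").fetchone()[0]  # 1
```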
http://blog.1SmartSolution.com//index.cfm?action=posts.entry&id=259&Data-Access-Optimization-In-SQL-Server-Partitioning
Tue, 12 Jan 2010 21:15:00 -0400