SQL Server Installation Center
Once you install SQL Server on your machine, this tool appears under Configuration Tools. It is a very handy SQL Server tool for DBAs and developers, as it gathers links to important SQL Server resources and walks you through the different installation options available in SQL Server.
Tuesday, September 10, 2013
Sunday, September 8, 2013
SQL Server management tools that make life easier (4)
SQL Server Configuration Manager
SQL Server Configuration Manager can be used to manage all SQL Server services. This SQL Server tool can also configure network protocols such as shared memory, named pipes and TCP/IP, and manage network connectivity configuration from SQL Server client machines. It is always advisable to start, stop, pause and resume SQL Server services using this tool, and as a best practice you should always change service accounts or passwords through SQL Server Configuration Manager as well.
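Configuration Manager itself is a GUI, but if you only want a scripted, read-only check of a service's state, a minimal Python sketch using the standard Windows sc utility works (MSSQLSERVER is the default instance's service name; named instances appear as MSSQL$InstanceName):

```python
# Read-only health check: ask Windows for the SQL Server service state.
import subprocess

result = subprocess.run(
    ["sc", "query", "MSSQLSERVER"],   # default instance service name
    capture_output=True, text=True, check=False,
)
print("running" if "RUNNING" in result.stdout else "not running")
```

For actually changing service accounts or protocol settings, stick with Configuration Manager as described above.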
Saturday, September 7, 2013
SQL Server management tools that make life easier (3)
Reporting Services Configuration Manager
Reporting Services Configuration Manager can create and change settings for the Report Server and Report Manager. If you installed SQL Server Reporting Services in install-only mode, you will have to use Reporting Services Configuration Manager afterwards to configure the report server for native mode. If you installed the Report Server using the install-and-configure option, you can use this SQL Server management tool to verify and modify the existing settings. The tool can configure a local or remote report server instance, and can set the service account under which the report server service runs. You can configure the Web Service and Report Manager URLs, as well as create, configure and manage the Report Server databases (ReportServer and ReportServerTempDB). Other features include:
- Configuring email settings on the report server to send out reports as an email attachment.
- Configuring the unattended execution account, which is used for remote connections during scheduled operations when user credentials are not available.
- Backing up, restoring or replacing the symmetric key that is used to encrypt connection strings and credentials.
Wednesday, September 4, 2013
SQL Server management tools that make life easier (2)
SQL Server Management Studio
Microsoft first introduced SQL Server Management Studio (SSMS) in SQL Server 2005. Database developers use this SQL Server management tool to develop T-SQL queries; create objects such as tables, indexes, constraints, stored procedures, functions and triggers; and debug T-SQL code. At the same time, database administrators use SSMS to perform maintenance tasks, such as index rebuild, index reorganize, backup and restore, and security management. You can even create various scripts for Analysis Services and manage the SQL Server Database Engine, SQL Server Integration Services and Reporting Services.
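SSMS runs these tasks interactively, but the same T-SQL can be scripted. Here is a minimal sketch driving two of the maintenance commands mentioned above from Python via pyodbc; the connection string, table and index names are made-up placeholders, not anything from this post:

```python
# Hedged sketch: scripted index maintenance against SQL Server.
# Connection details and object names below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=SalesDB;Trusted_Connection=yes;",
    autocommit=True,  # maintenance DDL should not sit inside a transaction
)
cur = conn.cursor()

# Full rebuild for a heavily fragmented index, reorganize for a lighter one.
cur.execute("ALTER INDEX IX_Orders_Date ON dbo.Orders REBUILD;")
cur.execute("ALTER INDEX IX_Orders_Customer ON dbo.Orders REORGANIZE;")
conn.close()
```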
Sunday, September 1, 2013
SQL Server management tools that make life easier
Business Intelligence Development Studio (BIDS)
Microsoft introduced Business Intelligence Development Studio (BIDS) in SQL Server 2005. BIDS is a SQL Server management tool built to help developers who are using SQL Server Integration Services, Reporting Services and Analysis Services. BIDS is essentially Microsoft Visual Studio plus project templates specific to SQL Server Business Intelligence.
SQL Server Data Tools (SSDT)
SQL Server Data Tools (SSDT) is the replacement for BIDS, starting with the release of SQL Server 2012. This SQL Server management tool has all the features of BIDS plus new enhancements such as:
- The Data Compare feature, which allows you to compare and synchronize data between two databases (a rough sketch of the idea follows this list).
- Support for SQL Server Unit Testing, which allows one to generate unit tests for SQL Server functions, triggers and stored procedures.
- Object Explorer, which can create, edit, delete and rename tables, functions, triggers and stored procedures, and can even perform some database administration tasks.
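To make the Data Compare idea concrete, here is a hedged Python sketch of its core logic: pull the same table from two databases, key rows by primary key, and report the differences. The DSNs, table and column names are hypothetical:

```python
# Conceptual miniature of SSDT's Data Compare (not the actual tool).
import pyodbc

QUERY = "SELECT CustomerID, Name, City FROM dbo.Customers"  # hypothetical table

def fetch(conn_str):
    conn = pyodbc.connect(conn_str)
    try:
        # Key each row by its primary key so the two sides line up.
        return {row.CustomerID: tuple(row) for row in conn.execute(QUERY)}
    finally:
        conn.close()

source = fetch("DSN=DevDB;Trusted_Connection=yes;")    # placeholder DSNs
target = fetch("DSN=ProdDB;Trusted_Connection=yes;")

missing = source.keys() - target.keys()
changed = {k for k in source.keys() & target.keys() if source[k] != target[k]}
print(f"{len(missing)} rows missing from target, {len(changed)} rows differ")
```

The real feature also generates the INSERT/UPDATE statements needed to synchronize the target, which this sketch leaves out.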
Saturday, August 31, 2013
MIT develops 110-core processor (experimental)
The Massachusetts Institute of Technology has developed a 110-core chip as it looks for power-efficient ways to boost performance in mobile devices, PCs and servers.
The processor, called the Execution Migration Machine, explores ways to reduce traffic inside chips, which enables faster and more power-efficient computing, said Mieszko Lis, a postgraduate student and Ph.D. candidate at MIT, during a presentation at the Hot Chips conference in California.
The chip is a general purpose processor and not an accelerator like a graphics processor. Typically a lot of data migration takes place between cores and cache, and the 110-core chip has replaced the cache with a shared memory pool, which reduces the data transfer channels. The chip is also able to predict data movement trends, which reduces the number of cycles required to transfer and process data.
The benefits of power-efficient data transfers could apply to mobile devices and databases. For example, data-traffic reduction will help mobile devices efficiently process applications like video while saving power. It could also reduce the amount of data sent by a mobile device over a network.
Fewer threads and predictive data behavior could help speed up databases. It could also free up shared resources for other tasks, Lis said.
The researchers have seen up to a 14-fold reduction in on-chip traffic, which significantly reduces power dissipation. In internal benchmarks the chip performed 25% better than other processors, Lis said, though he did not specify which competing processors were used for the comparison.
The chip has a mesh architecture with the 110 cores interconnected in a square design. It is based on a custom architecture designed to deal with large data sets and to make data migration easier, Lis said. The code was also written specially to work with the processor.
Friday, August 30, 2013
Developers hack Dropbox, show how to access user data
Two developers have cracked Dropbox's security, even intercepting SSL data from its servers and bypassing the cloud storage provider's two-factor authentication, according to a paper they published at USENIX 2013.
https://www.usenix.org/system/files/conference/woot13/woot13-kholia.pdf
The paper presents "new and generic techniques to reverse engineer frozen Python applications, which are not limited to just the Dropbox world," the developers wrote.
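The Dropbox client is a "frozen" Python application: its bytecode is bundled into the executable. As a taste of the kind of primitive such reverse-engineering builds on, once bytecode is recovered it can be rendered human-readable with Python's standard dis module (the function below is a made-up example, not Dropbox code):

```python
# Disassemble a function's bytecode - the raw material an analyst reads.
import dis

def check_token(token):          # hypothetical stand-in for client logic
    return token == "secret"

dis.dis(check_token)             # prints LOAD_FAST / LOAD_CONST / COMPARE_OP ...
```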
Thursday, August 29, 2013
What is Location Intelligence?
Location Intelligence is a business intelligence (BI) tool capability that relates geographic contexts to business data. Like BI, location intelligence software is designed to turn data into insight for a host of business purposes. Such tools draw on a variety of data sources, such as geographic information systems (GIS), aerial maps, demographic information and, in some cases, an organization's own databases.
The term Location Intelligence is often used to describe tools and data employed to geographically “map” information. These mapping applications can transform large amounts of data into color-coded visual representations that make it easy to see trends and generate meaningful intelligence. The creation of location intelligence is directed by domain knowledge, formal frameworks, and a focus on decision support.
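As a tiny illustration of those color-coded representations, here is a hedged Python sketch that plots made-up store locations colored by revenue with matplotlib; real location-intelligence tools layer the same idea over GIS base maps:

```python
# Color-coded "map": plot points by longitude/latitude, colored by value.
import matplotlib.pyplot as plt

# Hypothetical sample data: (longitude, latitude, monthly revenue in USD).
points = [(-74.0, 40.7, 120_000), (-87.6, 41.9, 95_000),
          (-118.2, 34.1, 140_000), (-95.4, 29.8, 60_000)]
lons, lats, revenue = zip(*points)

fig, ax = plt.subplots()
sc = ax.scatter(lons, lats, c=revenue, cmap="RdYlGn", s=120)
fig.colorbar(sc, ax=ax, label="Monthly revenue (USD)")
ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.set_title("Revenue by store location")
plt.show()
```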
Saturday, August 24, 2013
VMware, Citrix and Microsoft virtual desktops get encryption security
AFORE Solutions today announced encryption software aimed at securing data in virtualized environments where Microsoft Windows applications are used, including virtualized desktop infrastructure deployments based on VMware, Citrix or Microsoft VDI.
AFORE's CypherX software can be used by either cloud providers on behalf of their customers or directly by enterprise users in a private cloud deployment, according to the security firm's chairman and chief strategy officer, Jon Reeves. "This is intended for secure storage in the cloud," Reeves said about CypherX. "It sits between the application and the operating system itself in order to lock down applications. It encrypts all information flowing in and out, the file system, and the network or the clipboard."
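CypherX itself is proprietary, but the underlying idea of encrypting data before it reaches cloud storage can be sketched generically in Python with the cryptography package (this illustrates the concept, not AFORE's implementation):

```python
# Generic app-level encryption sketch: data is encrypted before it
# leaves the machine, so the storage provider only ever sees ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, held in a key-management system
cipher = Fernet(key)

plaintext = b"contents of a sensitive document"
ciphertext = cipher.encrypt(plaintext)          # safe to upload
assert cipher.decrypt(ciphertext) == plaintext  # recovered on the way back
```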
Friday, August 23, 2013
What is Hadoop?
Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation.
Hadoop makes it possible to run applications on systems with thousands of nodes involving thousands of terabytes. Its distributed file system facilitates rapid data transfer rates among nodes and allows the system to continue operating uninterrupted in case of a node failure. This approach lowers the risk of catastrophic system failure, even if a significant number of nodes become inoperative.
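Hadoop itself is written in Java, but its Streaming interface lets any language supply the map and reduce steps over stdin/stdout. Here is the classic word-count example sketched in Python (run via the hadoop-streaming jar, passing these scripts as the mapper and reducer):

```python
# mapper.py - Hadoop Streaming feeds input splits to stdin and
# collects tab-separated (key, value) pairs from stdout.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
# reducer.py - Hadoop sorts mapper output by key before the reducer
# runs, so the counts for each word arrive contiguously.
import sys

current_word, count = None, 0
for line in sys.stdin:
    word, value = line.rsplit("\t", 1)
    if word != current_word:
        if current_word is not None:
            print(f"{current_word}\t{count}")
        current_word, count = word, 0
    count += int(value)
if current_word is not None:
    print(f"{current_word}\t{count}")
```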
Thursday, August 22, 2013
What is BIG DATA?
Big data is the term for a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The challenges include capture, storage, search, sharing, transfer, analysis, and visualization.
As of 2012, limits on the size of data sets that are feasible to process in a reasonable amount of time were on the order of exabytes of data. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 quintillion (2.5×10^18) bytes of data were created.
Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead "massively parallel software running on tens, hundreds, or even thousands of servers". What is considered "big data" varies depending on the capabilities of the organization managing the set, and on the capabilities of the applications that are traditionally used to process and analyze the data set in its domain. "For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration."
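The phrase "massively parallel software" just means the work is split into chunks that many workers process at once. A toy Python sketch of the idea on a single machine (real big-data systems distribute the same pattern across many servers):

```python
# Partition a data set and aggregate the partitions in parallel workers.
from multiprocessing import Pool

def partial_sum(chunk):
    return sum(chunk)

if __name__ == "__main__":
    # Ten chunks of one million integers each, summed by four processes.
    chunks = [range(i, i + 1_000_000) for i in range(0, 10_000_000, 1_000_000)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # 49999995000000, identical to summing serially
```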
Researchers outwit Apple, plant malware in the App Store!
A team of researchers from Georgia Tech has demonstrated how hackers can slip a malicious app past Apple's reviewers so that it's published to the App Store and ready for unsuspecting victims to download.
Led by Tielei Wang, a research scientist at Georgia Tech's school of computer science, the team created a "Jekyll" app that posed as a benign news reader. Hidden inside the app, however, were code fragments, dubbed "gadgets," that self-assembled to create a proof-of-concept exploit only after the app was approved by Apple.
The assembled attack code was able to send tweets, email and texts without the user's knowledge, and could steal the iPhone's unique device ID, turn on the camera and take video, forward voice calls to other phones and connect with local Bluetooth devices. Because the reconfigured app also "phoned home" to a server operated by the researchers, they were able to download additional malware and compromise other apps on the smartphone, including the Safari browser.
Wednesday, August 21, 2013
Tuesday, August 20, 2013
Light News
- Cisco cuts 4,000 staff: Networking market leader Cisco has announced 4,000 job losses, despite reporting record results for the quarter.
- Microsoft has increased its global smartphone market share to 3.3%.
- Microsoft will release Windows 8.1 on October 18, almost a year after the company introduced Windows 8, its first tablet-optimized OS.
Monday, August 19, 2013
Data on the cloud: how secure?
"You
have no way of knowing. You can't trust anybody. Everybody is lying to
you," said security expert Bruce Schneier. "How do you know which
platform to trust? They could even be lying because the U.S. government
has forced them to."
How can you be sure your data is secure in the cloud, given laws that grant access to governments and intelligence agencies?
Network Monitoring and Troubleshooting for Dummies
Shared at this link:
http://www.4shared.com/office/Lqi9QM_X/Network_Monitoring_and_Trouble.html
Apache Struts vulnerabilities
"Chinese hackers are using an automated tool
to exploit known vulnerabilities in Apache Struts, in order to install
backdoors on servers hosting applications developed with the framework.
Apache Struts is a popular open-source framework for developing Java-based Web applications that's maintained by the Apache Software Foundation."
While I was looking for details on the Apache Struts vulnerabilities, I found this database:
http://www.cvedetails.com/vulnerability-list/vendor_id-45/product_id-6117/Apache-Struts.html
Insecure feeling!!