USETIM 2009

From Openresearch
{{Event
|Acronym=USETIM 2009
 
|Title=Using Search Engine Technology for Information Management
 
|Type=Workshop
 
|Field=Business informatics
 
|Superevent=VLDB 2009
 
|Start date=2009/08/24
 
|End date=2009/08/24
 
|Homepage=www.wikicfp.com/cfp/servlet/event.showcfp?eventid=5101
 
|City=Lyon
 
|Country=France
 
|Submission deadline=2009/05/17
 
}}
 
For information management, databases offer precise, controlled access to data, but they do not offer the easy-to-use search capabilities that most knowledge workers use daily on sites such as Google. Access to information contained in databases is more difficult and more restricted. One solution to this information bottleneck is to let search engines carry the brunt of the work, by offloading information from the database into alternative infrastructures, such as those provided by search engine technology. Many business applications, such as search, report generation, and data analysis, might be performed more efficiently on the replicated data without involving native database machinery, e.g. transactions. These offloaded databases, retaining some of their structure, can be recombined and mashed up to create one-off, possibly disposable, databases, while the primary data remains safe in the original database. This workshop will examine the limits and potential of using information retrieval and search engine technology for information management (IM) applications.
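The offloading idea above can be sketched minimally in Python (the SQLite table `docs` and its contents are illustrative assumptions, not part of the workshop description): rows are copied out of the relational store into an inverted index, the core data structure behind search engines, and keyword queries are then answered from the replica without touching the original database.

```python
import sqlite3
from collections import defaultdict

# Hypothetical stand-in for the "native database" described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO docs (id, body) VALUES (?, ?)", [
    (1, "search engine technology for information management"),
    (2, "relational database transactions and reports"),
    (3, "offloading database content into a search index"),
])
conn.commit()

# Offload: replicate the rows into an inverted index (term -> doc ids).
index = defaultdict(set)
for doc_id, body in conn.execute("SELECT id, body FROM docs"):
    for term in body.lower().split():
        index[term].add(doc_id)

def search(term):
    """Answer keyword queries from the replica, not the database."""
    return sorted(index.get(term.lower(), set()))

print(search("database"))  # → [2, 3]
print(search("search"))    # → [1, 3]
```

Because the replica is cheap to rebuild from the primary data, it can be treated as disposable, which is exactly the trade-off the workshop topics below explore.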
 
 
 
==Topics==
 
 
 
* search engine as a database
* business intelligence applications without OLAP
* optimization of relational database search
* affordances of search engine technology for database offloading
* mashups for user applications
* content aggregation systems
* limits of database offloading
* database connectors
* access-optimized databases
* disposable databases
* optimizing access, flexibility, and scalability
 
 
 
==Submissions==
 
 
 
Submission Deadline: May 17, 2009
 
 
 
==Important Dates==
 
Notification Due: May 29, 2009
 
Final Version Due: Jun 20, 2009
 
Conference: Aug 24, 2009
 
==Committees==
 
* General Co-Chairs
 
** [[has general chair::Gregory Grefenstette]], Exalead, France
 
** [[has general chair::Wolfgang Nejdl]], University of Hannover, Germany
 
** [[has general chair::David Simmen]], IBM Almaden, USA
 
 
 
* Program Committee Members
 
** [[has PC member::Rakesh Agrawal]], Microsoft, USA
 
** [[has PC member::Hannah Bast]], Max-Planck Institute, Germany
 
** [[has PC member::Lukas Biewald]], Dolores Labs, USA
 
** [[has PC member::Stefano Ceri]], Politecnico di Milano, Italy
 
** [[has PC member::Eben Haber]], IBM, USA
 
** [[has PC member::Donald Kossmann]], ETHZ, Switzerland
 
** [[has PC member::Pankaj Mehra]], HP, Russia
 
** [[has PC member::Johannes Meinecke]], SAP, Germany
 
** [[has PC member::Guillaume Pierre]], Vrije Universiteit, Netherlands
 
** [[has PC member::Swami Sivasubramanian]], Amazon, USA
 
** [[has PC member::Qi Su]], Aster Data Systems, USA
 
** [[has PC member::Øystein Torbjørnsen]], FAST, Norway
 
** [[has PC member::Ingmar Weber]], EPFL, Switzerland
 
