Blog Job Engineering
Use 32-bit and 64-bit Oracle Clients in parallel on Windows 7 64-bit, e.g. for .NET Apps
Since you probably came here looking for an answer, let me give it
first. After that, if you have time, I'll explain why it turned out to
be this sort of hack, i.e. what other approaches didn't work.
How
Download and install the Oracle 11g clients (which support lower DB
versions) for both 32-bit and 64-bit. I use the Instant Client, which comes
with an install.bat, for 32-bit, and the OUI-based client for 64-bit.
Open an elevated console. In %windir%\system32, create a soft link to the 64-bit Oracle client installation; in %windir%\SysWOW64, create a soft link to the 32-bit installation. Making a soft link to a directory means using the mklink command, as explained here. Visually it will display as such (I called my link 11g):
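For example (a sketch; the two client installation paths are assumptions, substitute your own):

rem 64-bit client, linked from the native system32 directory
mklink /D %windir%\system32\11g "C:\oracle\product\11.2.0\client_64"
rem 32-bit client, linked from the SysWOW64 directory that 32-bit processes see
mklink /D %windir%\SysWOW64\11g "C:\oracle\product\11.2.0\client_32"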
Edit your PATH environment variable and add the following path to it: c:\windows\system32\11g. Please note that %WINDIR% will not be expanded in %PATH%, so spell out the literal path.
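This single entry works for both bitnesses because WOW64 file-system redirection silently maps system32 to SysWOW64 for 32-bit processes, so each process resolves the 11g link to its matching client. You can verify the entry from a new console (a sketch):

rem should print the PATH line containing the literal 11g entry
echo %PATH% | find /I "c:\windows\system32\11g"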
Source
An error has occurred in the claim providers configured from this site collection
When checking names, the name does not resolve and the following error is displayed:
No exact match was found. Click the item(s) that did not resolve for more options.
Using the People Picker displays this error:
An error has occurred in the claim providers configured from this site collection
In the Event Log you will see the error below. If you look at the events that occurred just before event 8307 ("An exception occurred in All Users claim provider when calling SPClaimProvider.FillHierarchy()"), you will notice a few Alternate Access Mapping errors (event ID 8059). They seem to be irrelevant, but they are exactly the cause: you are missing the AAM mapping for the URL the user is using to access the site where name resolution was attempted. There might be a million other reasons for the error you are chasing, but try this first and make sure you have AAM configured (a PowerShell sketch follows the event details below). Note: batteries not included, but no reboot or IIS reset is required for the AAM update to take effect. 🙂
Log Name: Application
Source: Microsoft-SharePoint Products-SharePoint Foundation
Date: 1/29/2011 7:33:36 PM
Event ID: 8307
Task Category: Claims Authentication
Level: Error
Keywords:
User: Domain\loginID
Computer: serverName.domain.com
Description: An exception occurred in All Users claim provider when calling SPClaimProvider.FillHierarchy(): Object reference not set to an instance of an object.
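To add the missing mapping with PowerShell, something like this should work (the URLs and zone are placeholders for your environment):

# List the mappings the web application currently has
Get-SPAlternateURL -WebApplication http://yourwebapp

# Add the public URL users actually type, in the appropriate zone
New-SPAlternateURL -WebApplication http://yourwebapp -Url "https://portal.yourdomain.com" -Zone "Internet"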
http://www.agileconcepts.com/Blogs/AQ/Lists/Posts/Post.aspx?ID=52
Setting Up the Secure Store in SharePoint 2010
One of the settings PerformancePoint 2010 requires is the Secure Store. Without it, you cannot use PerformancePoint's unattended service account to connect to data sources. Unlike the 2007 version of the product, which used the application pool account as the application identity, the unattended account cannot use the application pool identity to connect to data sources. Rather, it must use a domain account whose password is stored in the secure store. If you have created a new PerformancePoint service application and there is no secure store set up in the default proxy group, you will need to configure the secure store. You can tell whether things have been set up correctly by visiting the PPS settings page: if the secure store hasn't been configured, you will see a warning saying so.
[Screenshot: Secure Store Warning]
In order to configure the secure store, you will need to create a secure store service first. Do that by clicking on New (in the ribbon) from the Manage Service Applications page. If the Edit functions in the ribbon appear inactive, make sure that the secure store service has been started. You can check this by going to the Manage services on server page in Central Administration. If it hasn’t been started, start it, and proceed with the steps below. Fill in the parameters, and click OK. Once the service is created, you can configure it.
To configure the secure store for PPS, follow these steps:
- Go to the Central Admin home page
- Under "Application Management" click "Manage Service Applications"
- Click on the Secure Store Service Proxy
- Click "Manage" in the ribbon
- You should get a message to generate a new key. Click "Edit" on the ribbon, then click "Generate New Key". You will be required to enter a pass phrase.
- When this completes, click "Edit" on the ribbon, then click "Refresh Key". Enter the same pass phrase you used in the previous step. (A PowerShell equivalent is sketched below.)
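If you prefer to script the key setup, roughly the following should work (a sketch; the proxy display name is an assumption, adjust it to your farm):

# Find the Secure Store proxy in the farm
$proxy = Get-SPServiceApplicationProxy | Where-Object { $_.DisplayName -eq "Secure Store Service Proxy" }

# Generate the master key, then propagate it to the application servers
Update-SPSecureStoreMasterKey -ServiceApplicationProxy $proxy -Passphrase "pass@word1"
Update-SPSecureStoreApplicationServerKey -ServiceApplicationProxy $proxy -Passphrase "pass@word1"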
Configure the Unattended Service Account (necessary for using the "Unattended Service Account" option on data sources)
- Go to the Central Admin home page
- Under "Application Management" click "Manage Service Applications"
- Click on the PerformancePoint Service Application
- Click the "Manage" button in the ribbon
- Click the first link: PerformancePoint Service Settings
- In the "Unattended Service Account" section, enter the user name and password to be used for querying data sources.
http://blogs.msdn.com/b/performancepoint/archive/2009/11/24/deploying-performancepoint-2010-soup-to-nuts.aspx
Install/Uninstall .NET service
Do the following:
Go to Start > All Programs > Microsoft Visual Studio 2008 (or 2010) > Visual Studio Tools, right-click "Visual Studio Command Prompt", and choose "Run as Administrator".
Enter the command to install or uninstall your service:
installutil "servicename.exe" or installutil -u "servicename.exe"
(If installutil complains that no installers were found, see the sketch below.)
When I use the Django shell, it shows an error; this is the error:
>>> from django.db import models
>>> class Poll(models.Model):
... question = models.CharField(max_length=200)
... pub_date = models.DateTimeField('date published')
...
Traceback (most recent call last):
File "", line 1, in
File "D:\Python25\lib\site-packages\django\db\models\base.py", line 51, in __new__
kwargs = {"app_label": model_module.__name__.split('.')[-2]}
IndexError: list index out of range
Solution:
The model definition must live in an application. The error you're seeing arises because Django takes the __name__ of model_module (which should be something like project.appname.models for project\appname\models.py) and extracts the app name, appname, from it. In the interactive console the module's __name__ is '__main__', so it fails.
To get around this, you'll need to specify the app_label yourself in the Meta class:
>>> from django.db import models
>>> class Poll(models.Model):
... question = models.CharField(max_length=200)
... pub_date = models.DateTimeField('date published')
... class Meta:
... app_label = 'test'
For an explanation of why you can do that, look at the file mentioned in the traceback, D:\Python25\lib\site-packages\django\db\models\base.py:
if getattr(meta, 'app_label', None) is None:
# Figure out the app_label by looking one level up.
# For 'django.contrib.sites.models', this would be 'sites'.
model_module = sys.modules[new_class.__module__]
kwargs = {"app_label": model_module.__name__.split('.')[-2]}
else:
kwargs = {}
(Where meta is the Meta class; see just above in that file.)
Source: http://stackoverflow.com/questions/4382032/defining-a-model-class-in-django-shell-fails
Labels: Django Class
There is a bug when exporting tables that contain BLOB columns: the contents of those columns cannot be exported to another database. The solution is as follows:
When you use an old version of exp to export tables with LOB columns from Oracle 9.2.0.5 or a higher version, you will get the error "EXP-00003: no storage definition found for segment ...". This is actually an Oracle bug; you can temporarily get around it by replacing the view exu9tne, as follows:
Before exporting, run the following SQL under sys:
CREATE OR REPLACE VIEW exu9tne (
tsno, fileno, blockno, length) AS
SELECT ts#, segfile#, segblock#, length
FROM sys.uet$
WHERE ext# = 1
UNION ALL
SELECT * FROM SYS.EXU9TNEB
/
After exporting, run the following to restore the view definition, according to the Metalink notes:
CREATE OR REPLACE VIEW exu9tne (
tsno, fileno, blockno, length) AS
SELECT ts#, segfile#, segblock#, length
FROM sys.uet$
WHERE ext# = 1
/
Of course, if you make this change, you should roll back the view definition after your work is done.
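With the view replaced, the export itself runs as usual, e.g. (a sketch; user, password, connect string, table and file names are placeholders):

exp scott/tiger@orcl tables=(BLOB_TABLE) file=blob_table.dmp log=blob_table.log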
Source
Transaction Aware Table-adapters in .NET 2.0
Introduction
Much has been said about the missing support for atomic transactions in .NET 2.0 when using table adapters generated by the dataset designer. This article provides a ready-to-use base class that, when injected into a table adapter's inheritance chain, equips it with transaction capabilities.
Discussion of Approaches
One way to get around the aforementioned limitations is to configure your server to have DTC (the Distributed Transaction Coordinator) enabled. In that case, you can use TransactionScope instances to guard actions like modifications through table adapters. However, if you don't want to or cannot change the server's configuration, TransactionScope is not an option.
Others on The Code Project and other sites have explained how to enable transactions for table adapters. The basic idea is always to attach a SqlTransaction to all commands of a table adapter and, if multiple table adapters are used, to share the same transaction across all commands of all adapters. What makes this a little difficult is that a generated table adapter keeps its SqlDataAdapter property Adapter private, yet the transaction code needs to modify that property.
There are several approaches to solving this issue. For instance, Avoiding DTC with Typed DataSets and SQL Server 2000 suggests adding the transaction code in the form of partial classes. As the partial extension lives in the same scope as the original table adapter, it can access its private properties. The major disadvantage is that a partial class has to be created for each generated table adapter.
A much more elegant approach is mentioned in this post (unfortunately in German): use reflection to get your hands on those properties. The article also mentions a small detail I found very interesting: when editing a table adapter in designer mode, you are actually able to change its base class! So instead of the default base class Component you can fill in your own. If you provide a base class that derives from Component and implements the transaction plumbing through reflection, usage becomes simple and elegant. And this is exactly what I did: I put together such a handy base class, called TransactionSupport.
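Conceptually, the base class looks something like the sketch below. This is my reconstruction, not the downloadable source: it assumes the designer-generated adapter exposes the non-public properties Connection, Adapter and CommandCollection, which reflection makes reachable from the base class.

using System;
using System.ComponentModel;
using System.Data;
using System.Data.SqlClient;
using System.Reflection;

namespace BizWiz
{
    // Injected as the table adapter's base class via the designer's BaseClass property.
    public class TransactionSupport : Component
    {
        private SqlTransaction transaction;

        public SqlTransaction Transaction
        {
            get { return transaction; }
            set { transaction = value; AttachTransaction(); }
        }

        public void BeginTransaction()
        {
            SqlConnection connection = (SqlConnection)GetAdapterProperty("Connection");
            if (connection.State != ConnectionState.Open)
                connection.Open();
            Transaction = connection.BeginTransaction();
        }

        public void CommitTransaction()
        {
            transaction.Commit();
            transaction = null;
        }

        public void RollbackTransaction()
        {
            transaction.Rollback();
            transaction = null;
        }

        // Attach the shared transaction to every command the adapter uses.
        private void AttachTransaction()
        {
            if (transaction == null)
                return;

            // Insert/update/delete commands live on the inner SqlDataAdapter ...
            SqlDataAdapter adapter = (SqlDataAdapter)GetAdapterProperty("Adapter");
            Enlist(adapter.InsertCommand);
            Enlist(adapter.UpdateCommand);
            Enlist(adapter.DeleteCommand);

            // ... while the select commands live in the CommandCollection.
            foreach (SqlCommand command in (SqlCommand[])GetAdapterProperty("CommandCollection"))
                Enlist(command);
        }

        private void Enlist(SqlCommand command)
        {
            if (command != null)
            {
                command.Connection = transaction.Connection;
                command.Transaction = transaction;
            }
        }

        // The generated adapter keeps these properties non-public; reflection reaches them.
        private object GetAdapterProperty(string name)
        {
            PropertyInfo info = GetType().GetProperty(
                name, BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic);
            return info.GetValue(this, null);
        }
    }
}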
Using the Code
Using the base class is very simple. First download the class source code and drop it into your project. Then, for all tables in your dataset that require transaction support, change the table adapter's base class like this:
- Select the table adapter (not the table) in the dataset designer, so that it becomes visible in the properties view.
- Change the property BaseClass from System.ComponentModel.Component to the class BizWiz.TransactionSupport I provided, as shown in the screenshot.
And that's it. Do this for each table adapter that requires it. Then you can write code of the following pattern:
customerTableAdapter.BeginTransaction();
orderTableAdapter.Transaction = customerTableAdapter.Transaction;
try
{
    // ... perform the actual updates through the adapters here ...
    customerTableAdapter.CommitTransaction();
}
catch( Exception e )
{
    customerTableAdapter.RollbackTransaction();
}
You can choose any of the participating table adapters to perform the commit or rollback, but I consider it good practice to have one dedicated transaction master that does it all: BeginTransaction(), CommitTransaction() and RollbackTransaction().
Oracle 10g very slow when generating reports using Crystal Reports
Recently we upgraded our production database from Oracle 9i to Oracle 10g. After this, our reports' response time became very bad. Searching Google revealed that Oracle 10g has a bug in its ODBC connection.
I tried to fix the ODBC setup:
1. Upgrade the Oracle 10g ODBC driver to version 10.2.0.3.0 (12-Dec-2006) --> the speed was still SLOW
2. Set the parameter _optimizer_ignore_hints=TRUE, either as an initialization parameter or by creating a trigger like this:
create or replace trigger sys.logon_trigger
after logon on application_user.schema
begin
execute immediate
'alter session set "_optimizer_ignore_hints"=TRUE';
end;
/
application_user is your schema's name. After creating the trigger, my reports were generated faster than before.
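If you prefer the initialization-parameter route instead of the logon trigger, something like this should work (a sketch; hidden parameters must be double-quoted, and the instance must be restarted for an SPFILE change to take effect):

ALTER SYSTEM SET "_optimizer_ignore_hints" = TRUE SCOPE = SPFILE;
-- restart the instance afterwards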
I got this from here