
Informix News
18 Nov 13 - ZDNet - Top 20 mobile skills in demand... Read
09 Sep 13 - telecompaper - Shaspa and Tatung have shown a new smart home platform at Ifa in Berlin. Powered by the IBM Informix software... Read
06 Sep 13 - IBM data magazine - Mission Accomplished - Miami, Florida will be the backdrop for the 2014 IIUG Informix Conference... Read
01 Feb 13 - IBM Data Magazine - Are your database backups safe? Lester Knutsen (IBM Champion) writes about database back up safety using "archecker"... Read
14 Nov 12 - IBM - IBM's Big Data For Smart Grid Goes Live In Texas... Read
3 Oct 12 - The Financial - IBM and TransWorks Collaborate to Help Louisiana-Pacific Corporation Achieve Supply Chain Efficiency... Read
28 Aug 12 - techCLOUD9 - Splunk kicks up a SaaS Storm... Read
10 Aug 12 - businessCLOUD9 - Is this the other half of Cloud monitoring?... Read
3 Aug 12 - IBM data management - Supercharging the data warehouse while keeping costs down: IBM Informix Warehouse Accelerator (IWA) delivers superior performance for in-memory analytics processing... Read
2 Aug 12 - channelbiz - Oninit Group launches Pay Per Pulse cloud-based service... Read
28 May 12 - Bloor - David Norfolk on the recent Informix benchmark "pretty impressive results"... Read
23 May 12 - DBTA - Informix Genero: A Way to Modernize Informix 4GL Applications... Read
9 Apr 12 - Mastering Data Management - Upping the Informix Ante: Advanced Data Tools... Read
22 Mar 12 - developerWorks - Optimizing Informix database access... Read
14 Mar 12 - BernieSpang.com - International Informix User Group set to meet in San Diego... Read
1 Mar 12 - IBM Data Management - IIUG Heads West for 2012 - Get ready for sun and sand in San Diego... Read
1 Mar 12 - IBM Data Management - Running Informix on Solid-State Drives: Speed Up Database Access... Read
26 Feb 12 - BernieSpang.com - Better results, lower cost for a broad set of new IBM clients and partners... Read
24 Feb 12 - developerWorks - Informix Warehouse Accelerator: Continuous Acceleration during Data Refresh... Read
6 Feb 12 - PRLOG - Informix port delivers unlimited database scalability for popular SaaS application ... Read
2 Feb 12 - developerWorks - Loading data with the IBM Informix TimeSeries Plug-in for Data Studio... Read
1 Feb 12 - developerWorks - 100 Tech Tips, #47: Log-in to Fix Central... Read
13 Jan 12 - MC Press online - Informix Dynamic Server Entices New Users with Free Production Edition ... Read
11 Jan 12 - Computerworld - Ecologic Analytics and Landis+Gyr -- Suitors Decide to Tie the Knot... Read
9 Jan 12 - planetIDS.com - DNS impact on Informix / Impacto do DNS no Informix... Read
8 Sep 11 - TMCnet.com - IBM Offers Database Solution to Enable Smart Meter Data Capture... Read
1 Aug 11 - IBM Data Management Magazine - IIUG user view: Happy 10th anniversary to IBM and Informix... Read
8 Jul 11 - Database Trends and Applications - Managing Time Series Data with Informix... Read
31 May 11 - Smart Grid - The meter data management pitfall utilities are overlooking... Read
27 May 11 - IBM Data Management Magazine - IIUG user view: Big data, big time (Series data, warehouse acceleration, and 4GLs)... Read
16 May 11 - Business Wire - HiT Software Announces DBMoto for Enterprise Integration, Adds Informix. Log-based Change Data Capture... Read
21 Mar 11 - Yahoo! Finance - IBM and Cable&Wireless Worldwide Announce UK Smart Energy Cloud... Read
14 Mar 11 - MarketWatch - Fuzzy Logix and IBM Unveil In-Database Analytics for IBM Informix... Read
11 Mar 11 - InvestorPlace - It's Time to Give IBM Props: How many tech stocks are up 53% since the dot-com boom?... Read
9 Mar 11 - DBTA - Database Administration and the Goal of Diminishing Downtime... Read
2 Feb 11 - DBTA - Informix 11.7 Flexible Grid Provides a Different Way of Looking at Database Servers... Read
27 Jan 11 - exactsolutions - Exact to Add Informix Support to Database Replay, SQL Monitoring Solutions... Read
25 Jan 11 - PR Newswire - Bank of China in the UK Works With IBM to Become a Smarter, Greener Bank... Read
12 Oct 10 - Database Trends and Applications - Informix 11.7: The Beginning of the Next Decade of IBM Informix... Read
20 Sep 10 - planetIDS.com - ITG analyst paper: Cost/Benefit case for IBM Informix as compared to Microsoft SQL Server... Read
20 Jul 10 - IBM Announcements - IBM Informix Choice Edition V11.50 helps deploy low-cost scalable and reliable solutions for Apple Macintosh and Microsoft Windows... Read
20 Jul 10 - IBM Announcements - Software withdrawal: Elite Support for Informix Ultimate-C Edition... Read
24 May 10 - eWeek Europe - IBM Supplies Database Tech For EU Smart Grid... Read
23 May 10 - SiliconIndia - IBM's smart metering system allows wise use of energy... Read
21 May 10 - CNET - IBM to help people monitor energy use... Read
20 May 10 - ebiz - IBM Teams With Hildebrand To Bring Smart Metering To Homes Across Britain... Read
19 May 10 - The New Blog Times - Misurare il consumo energetico: DEHEMS è pronto... Read
19 May 10 - ZDNet - IBM software in your home? Pact enables five-city smart meter pilot in Europe... Read
17 Mar 10 - ZDNet (blog) David Morgenstern - TCO: New research finds Macs in the enterprise easier, cheaper to manage than... Read
17 Mar 10 - Virtualization Review - ...key components of Big Blue's platform to the commercial cloud such as its WebSphere suite of application servers and its DB2 and Informix databases... Read
10 Feb 10 - The Wall Street Journal - International Business Machines is expanding an initiative to win over students and professors on its products. How do they lure the college crowd?... Read



RSS Feed: IIUG Forum: Spatial Users

More IIUG RSS Feeds

Discussion of issues relating to the use of the Informix Spatial DataBlade Module to store, manipulate, index and analyze multidimensional spatial data.


URL: http://www.iiug.org/rss/spatial-users.rss

Below is the latest content available from this feed:

About st_astext on esqlc
Posted by: joluinfante@gmail.com (JORGE INFANTE) - Fri, 19 Dec 2014 18:25:00 EST
Hi!

I'm trying, in a .ec program, to retrieve a spatial field in WKT format, but I
can't find a sample for the select method.

I built this code, but it doesn't work:



char *getSpatialValue(char *tablename, char *columname, long se_row_id)
{
    static char value[8192];
    char st[512];

    EXEC SQL BEGIN DECLARE SECTION;
        int n, xid;
        short typ;
    EXEC SQL END DECLARE SECTION;

    $varchar sst[512];
    $varchar scolumname[32];
    $varchar stablename[32];
    $varchar svalue[8192];

    EXEC SQL include sqltypes;

    SQLCODE = 0;
    EXEC SQL set connection 'db';
    strcpy(stablename, tablename);
    strcpy(scolumname, columname);
    memset(value, 0, sizeof value);
    sprintf(st, "select st_astext(%s) from %s where se_row_id = %ld%c",
            columname, tablename, se_row_id, 0);
    stcopy(st, sst);
    EXEC SQL prepare s from :sst;
    EXEC SQL DECLARE c CURSOR FOR s;
    EXEC SQL allocate descriptor 'seldesc';
    n = 1;
    EXEC SQL set descriptor 'seldesc' COUNT = :n;
    typ = SQLUDTVAR;
    xid = XID_LVARCHAR;
    EXEC SQL set descriptor 'seldesc' VALUE :n
        TYPE = :typ, EXTYPEID = :xid, DATA = :svalue;
    EXEC SQL OPEN c using sql descriptor 'seldesc';
    while (SQLCODE == 0)
    {
        memset(svalue, 0, sizeof svalue);
        EXEC SQL FETCH c into :svalue;
        if (SQLCODE == 0)
            strcpy(value, svalue);
    }
    EXEC SQL CLOSE c;
    EXEC SQL FREE c;
    EXEC SQL FREE s;
    EXEC SQL set connection 'cdc';
    return value;
}



I'm trying to get a string value back from st_astext(), but I can't receive it
(this program works fine with a plain varchar select). I then added the code to
allocate and set the descriptor, but nothing works.

What is the best way to access the data: st_astext(), or fetching the raw chunk
and decoding it? I can't find a sample for either kind of select (only for
retrieving a chunk of data).



Thanks in advance

jorge infante
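
A quick way to narrow this down (just a suggestion; the table and column names
below are placeholders, not taken from the original schema) is to run the
generated statement in dbaccess before adding the dynamic-descriptor plumbing,
to confirm that st_astext() itself returns the WKT you expect:

-- placeholder names; substitute the table/column passed to getSpatialValue()
select st_astext(the_geom)
from mytable
where se_row_id = 1;

If that returns the text in dbaccess, the problem is confined to the descriptor
and host-variable handling in the .ec code rather than to the SQL itself.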









* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *



To post a response via email (IIUG members only):



1. Address it to spatial-users@iiug.org

2. Include the bracketed message number in the subject line: [15]



* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Use Datablade API with spatial datablade?
Posted by: mike@barrodale.com (MIKE DUNHAM-WILKIE) - Sat, 26 May 2012 11:24:12 EDT
I've been trying to upload geometry to an st_geometry column using the
DataBlade API.

I set my mi_lvarchar variable lvvar using:

geom_to_wkb(&geom, &max_alloced, &lot_wkb_len, &lot_wkb_buf);
lvvar = mi_new_var(lot_wkb_len);
mi_set_varlen(lvvar, lot_wkb_len);
mi_set_vardata(lvvar, lot_wkb_buf);

(where geom_to_wkb is as in the Spatial DataBlade example code)

I then prepare the statement:

insert into table(linework,id) values(st_geomfromwkb(?,15),?);

and set:

numTypes   = 2;
values[0]  = lvvar;
types[0]   = "lvarchar";
nulls[0]   = MI_FALSE;
lengths[0] = 0;

(and set values[1], etc.)

When I execute the prepared statement:

mi_exec_prepared_statement(statement, MI_SEND_READ, 1, numTypes,
                           values, lengths, nulls, types, 0, 0);

I get the error:

ERROR: USE44 XUSE44:-937:Unknown OGIS WKB byte-order byte encountered in
ST_GeomFromWKB. while executing statement <no query text>

However, when I do an mi_get_vardata on the mi_lvarchar that I'm passing up
(lvvar) and inspect the binary contents, I get:

length is 98
swap_bytes is 0
wkb_type is 5
nlines is 1
swap_bytes is 0 (for first line)
wkb_type is 2 (for first line)
....

Both swap_bytes values appear to be correct.

Thanks for any help you can provide.

Mike Dunham-Wilkie
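
One way to isolate whether the problem is in the WKB buffer itself or elsewhere
in the insert path (only a sketch, reusing the column names from the prepared
statement above, an arbitrary test geometry, and your real table name in place
of the placeholder) is to try the equivalent insert through WKT, which bypasses
the byte-order byte entirely:

-- arbitrary test geometry; 15 is the SRID used in the prepared statement above
insert into table (linework, id)
values (st_geomfromtext('multilinestring ((0 0, 10 10))', 15), 1);

If the WKT form succeeds, the error points back at the binary buffer handed to
ST_GeomFromWKB: the first byte of an OGC WKB stream must be the byte-order
marker (0 for big-endian, 1 for little-endian), so anything preceding it, such
as a length prefix, would trigger this message.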




* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

To post a response via email (IIUG members only):

1. Address it to spatial-users@iiug.org
2. Include the bracketed message number in the subject line: [14]

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

A question about Informix JDBC 3.x driver
Posted by: joluinfante@gmail.com (JORGE INFANTE) - Sat, 13 Mar 2010 18:30:49 EST
Hi,

With a previous version of the driver, to access spatial data I needed to use:

Connection conn = DriverManager.getConnection(connectionStr, user, _pw);
java.util.Map typeMap = IfxSQLData.enableTypes(conn);

Then I could access the spatial data using:

IfxGeometry mygeo = rs.getObject(mygeocol, conn.getTypeMap());

So, to access spatial data, I had to register the type in the connection.

With the 3.x driver, can I (using a DataSource) avoid registering the type in
the connection and still get the spatial data with rs.getXXX? I'm establishing
the connection through a DataSource, with a generic connection class, and I'd
like to avoid building a specific class just to persist the typeMap attribute
for the Informix connection.

Excuse my bad English...

jorge infante
direccion de catastro
municipalidad de rosario
rosario - santa fe - argentina
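
If registering the type map turns out to be awkward with a generic DataSource
wrapper, one workaround (a sketch only, at the cost of parsing text on the
client; the table name here is a placeholder) is to let the server hand back
plain WKT instead of IfxGeometry objects, which any driver version can read
with rs.getString():

-- st_astext() returns the geometry as WKT text, so no custom type map is needed
select st_astext(mygeocol) from mytable;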




* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

To post a response via email (IIUG members only):

1. Address it to spatial-users@iiug.org
2. Include the bracketed message number in the subject line: [13]

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Re: Problem with st_transform
Posted by: uleman@us.ibm.com (ROBERT ULEMAN) - Fri, 24 Apr 2009 19:58:43 EDT
Jorge,

Thank you for this question. I'd love to see more traffic on this forum!

Fortunately, this is an easy one. It really helps that you present a complete
test case; thank you for that.

You did everything right, but a tiny typographical error tripped you up. In
the srtext value you insert into the spatial_ref_sys table, you substitute the
WGS 1984 datum for the GRS 1980 datum that actually underlies the POSGAR
projections. I don't know the source of your datum specification (perhaps
copied from the srtext for srid=4?), but the last two digits in the second
spheroid parameter are transposed (_highlights_ inserted): 298.2572235_36_
should be 298.2572235_63_ . When I made that correction, the ST_Transform
function worked for me:

> select st_transform(the_geom, 4) from mytable;

(expression) 4 MULTIPOLYGON (((-60.6447115552 -32.95151404, -60.6447043468
-32.9518464663, -60.6443271611 -32.9519433611, -60.644101185 -32.9516707816,
-60.6443388958 -32.9514054106, -60.6447115552 -32.95151404)))

1 row(s) retrieved.

As you probably know, the ST_Transform() function can only transform data
between spatial reference systems that are defined on the same datum. This is
what "incompatible" in the error message refers to: in this case the datums
were not the same because their definitions differed in those two transposed
digits. Unfortunately, the Spatial DataBlade is not particularly smart about
detecting datum differences: it performs a strict character-based comparison,
not a numeric one based on knowledge of the meaning of each parameter.
Therefore, the character strings must be identical (though I don't know if the
comparison is case-sensitive) and EVEN THE PRESENCE OR ABSENCE OF TRAILING
ZEROS can make the comparison fail. In this case, the numeric values were
actually different, so that needed to be fixed anyway.
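
Given that strict string comparison, a quick sanity check before calling
ST_Transform() is simply to pull both srtext values from the catalog and compare
the DATUM and SPHEROID clauses side by side (the catalog name below is the one
used elsewhere in this thread; substitute your own SRIDs):

select srid, srtext
from sde.spatial_ref_sys
where srid in (4, 1202);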

A few more comments; these may be arcane to most readers, but I hope some will
find them useful.

As you (Jorge) work for a cadastral agency, you are probably well aware that
there is a difference between the GRS80 and WGS84 geodetic references: the
inverse-flattening spheroid parameters are slightly different. To handle this
properly, a real datum transformation is needed; this requires specialized
software, and the Informix Spatial DataBlade does not provide this. For many
GIS applications, working with data of limited accuracy, the difference is
negligible; but for surveying purposes, it is not. If you know what you're
doing (and it certainly looks like you do) then you are obviously able to make
your own choice whether to accept the discrepancies that may result.

It is certainly not a requirement, but I have found it helpful to choose the
SRID as well as the coordinate offsets and units myself, rather than rely on
the function SE_CreateSRID(). These days, our internal coordinates are
expressed in big (64-bit) integers, so there is almost never a need to
fine-tune the offsets and units for individual project areas; a single choice
will suffice for all uses of a particular coordinate system. In other words,
you can cover the entire earth and still have sub-mm precision. This means
that you can tie each SRS you define one-for-one to the associated EPSG
coordinate reference, which in turn means that you can use the EPSG SRID
(22185, in this case) for your local database SRID as well. That makes it very
easy to keep track of what each SRID stands for, and if applied consistently
gives SRIDs a fixed meaning across multiple databases.

As to the choice of (X,Y) offsets and units, you chose a range of about 5500
km in X and only 45 km in Y. I'm not sure this is right: Transverse Mercator
projections are usually used for regions with a greater north-south extent. In
any case, SE_CreateSRID() produced an offset of -537 km and 5,791 km in X and
Y, respectively, and an xyunits value of over a billion, for a resolution of
less than 1 nm. Perhaps a more widely usable choice would be to follow the
published limits (from spatialreference.org) with very wide margins:
Projected Bounds: 5346660.9906, 5660186.1166, 5653339.0094, 7412332.1438
X offset: -1,000,000 (in case you really want to go west to near-zero
X-values)
Y offset: -1,000,000 (it doesn't hurt to start so far away from any reasonable
coordinates you'll ever encounter)
X/Y units: 1,000,000 (for a resolution of 1 micron)
With this, you still have enough digits to represent positive coordinates in
the billions; you have round numbers in the offsets and units, which makes it
easy to do back-of-the-envelope calculations and estimates in case you suspect
something is wrong; and no matter what kind of buffer operations you do, the
internal coordinate representation will never be a limitation. While it may be
useful at times to have the "coordinate out of range" error alert you to a bug
in a procedure or client, it is generally not a good idea to choose your
limits so tight that this becomes a reliable error trap, because if you ever
wanted to exceed the official bounds for some algorithmic or computational
reason, you would be stuck, with no escape.
With all this, your definition of the projected SRS would use this statement:

INSERT INTO spatial_references (
srid, description, auth_name, auth_srid,
falsex, falsey, xyunits, falsez, zunits, falsem, munits,
srtext
) VALUES (
22185, 'POSGAR 94 / Argentina 5', 'EPSG', 22185,
-1000000, -1000000, 1000000, -1000, 1000, -1000, 1000,
'PROJCS["Gauss_Kruger_Faja5", GEOGCS["GCS_WGS_1984",
DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137,298.257223563]],PRIMEM["Greenwich",0],UNIT["Degree",0.0174532925199433]],

PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",5500000.0],PARAMETER["False_Northing",0.0],

PARAMETER["Central_Meridian",-60.0],PARAMETER["Scale_Factor",1.0],PARAMETER["Latitude_of_Origin",-90.0],UNIT["Meter",1.0]]'
);

I hope that I've answered your original question and not made things too
confusing with all the other stuff. Don't hesitate to post further comments or
questions. If you want to take this off-line, you can contact me at the email
address below.

-Robert Uleman
Worldwide Technical Sales
Spatiotemporal Data Management
uleman@us.ibm.com




* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

To post a response via email (IIUG members only):

1. Address it to spatial-users@iiug.org
2. Include the bracketed message number in the subject line: [12]

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Problem with st_transform
Posted by: joluinfante@gmail.com (JORGE INFANTE) - Wed, 22 Apr 2009 22:54:31 EDT
Hi, forum!

We are in Rosario, Argentina, working on coordinate transformation from UTM
(meters) to geographic (lat, lon) coordinates.
Following the example in the online documentation (Example 2: Projecting data
dynamically), we generated new records for the spatial_ref_sys and
spatial_references tables.
The steps:
a) Add a record for EPSG 22185 (a new srid, 1202):
execute function se_createsrid(8223.64246705, 6335848.41072399,
5459374.50493382, 6380556.86838906, 'POSGAR 94 / Argentina 5');
b) Complete the other fields:
update sde.spatial_ref_sys
set auth_name = 'EPSG', auth_srid = 22185, srtext =
'PROJCS["Gauss_Kruger_Faja5", GEOGCS["GCS_WGS_1984", DATUM["D_
WGS_1984",SPHEROID["WGS_1984",6378137,298.257223536]],
PRIMEM["Greenwich",0],UNIT["Degree",0.0174532925199433]],
PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",5500000.0],
PARAMETER["False_Northing",0.0],PARAMETER["Central_Meridian",-60.0],
PARAMETER["Scale_Factor",1.0],PARAMETER["Latitude_of_Origin",-90.0],
UNIT["Meter",1.0]]' where srid = 1202;
c) Create a new table to hold the Rosario data:
create table mytable(se_row_id serial, the_geom st_geometry);
d) Add a record with values in meters:
insert into mytable(the_geom) values('1202 MULTIPOLYGON (((5439716.1698
6354410.48638, 5439717.07569 6354373.62259, 5439752.41517 6354363.09261,
5439773.35028 6354393.44852, 5439750.9494 6354422.73947, 5439716.1698
6354410.48638)))');
e) Try a transformation of the record (meters to degrees), from projection
srid 1202 to srid 4 (EPSG 4326, standard):
select st_transform(the_geom, 4) from mytable;

Here we see the error:
(USE46) - Incompatible coordinate reference systems in function ST_Transform.
The data in the srtext fields looks OK.
I'm thinking we have a problem with other fields of the spatial_references
table.

Can anyone help me?

TIA
jorge infante
dirección de catastro
municipalidad de rosario
rosario - santa fe - argentina




* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

To post a response via email (IIUG members only):

1. Address it to spatial-users@iiug.org
2. Include the bracketed message number in the subject line: [11]

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Re: BTree Index Extent Sizes with Spatial Tables
Posted by: stuart.mccann@lands.nsw.gov.au (STUART MCCANN) - Thu, 01 Mar 2007 20:32:48 EST
Unfortunately, it seems that once you use a spatial type in a table (and I
suspect other UDTs, although I haven't tested), the B-tree index extent sizing
algorithm gets it wrong, even though the indexes are on built-in data types.

For example, given the following DDL and loading some data:

create table testt (
    id integer,
    shape st_multipolygon
) put shape in ( blobsbs ) extent size 81400 ;

create index testi on testt ( id ) ;

The output below from oncheck -pt clearly shows the first extent size for
table testt as 40700 pages (81400 KB). The index is on an int column and has
been allocated a first extent of just 142 pages (284 KB); however, 2410 pages
(4820 KB) have been used to store the index.

TBLspace Report for sdecad:informix.testt

    Physical Address               8:2012
    Creation date                  02/22/2007 14:55:09
    TBLspace Flags                 901        Page Locking
                                              TBLspace contains VARCHARS
                                              TBLspace use 4 bit bit-maps
    Maximum row size               2056
    Number of special columns      1
    Number of keys                 0
    Number of extents              1
    Current serial value           1
    Pagesize (k)                   2
    First extent size              40700
    Next extent size               8
    Number of pages allocated      40700
    Number of pages used           11125
    Number of data pages           11122
    Number of rows                 68438
    Partition partnum              8388717
    Partition lockid               8388717

    Extents
         Logical Page     Physical Page        Size    Physical Pages
                    0           8:36823       40700             40700

Index testi fragment partition cadreposdbs in DBspace cadreposdbs

    Physical Address               8:2013
    Creation date                  02/22/2007 14:55:09
    TBLspace Flags                 801        Page Locking
                                              TBLspace use 4 bit bit-maps
    Maximum row size               2056
    Number of special columns      0
    Number of keys                 1
    Number of extents              12
    Current serial value           1
    Pagesize (k)                   2
    First extent size              142
    Next extent size               4
    Number of pages allocated      2410
    Number of pages used           2409
    Number of data pages           0
    Number of rows                 0
    Partition partnum              8388718
    Partition lockid               8388717

I guess this is most likely due to the calculated maximum row size (2048
maxlen for the spatial type plus 4 bytes for the int).

So, in our systems we need to drop and recreate B-tree indexes daily on highly
active (insert, update, delete) spatial tables as they become interleaved
within their DBspaces. That is not a very high-availability option, though:
everybody and everything out. The online drop/create index statement causes too
many concurrency problems to contemplate using it in mission-critical
production systems.

Alternatively, we can create the table with a first extent size 20 times
greater than the data requires, so that the index gets a properly sized first
extent. That is just a bit wasteful on disk, though. It would be really nice if
you could specify an extent size when creating the index, or if the index
extent size algorithm were changed to handle tables with spatial/UDT columns
better.
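
(A note for anyone hitting this on a newer engine: if memory serves, Informix
11.70 and later accept explicit extent sizing directly on CREATE INDEX, which
would avoid the oversized-table workaround. The clause and the sizes below are
illustrative only, so verify them against your version's documentation:)

-- sizes in KB, chosen here to match the ~4820 KB the index actually used above
create index testi on testt ( id )
    extent size 4820 next size 4820 ;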




* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

To post a response via email (IIUG members only):

1. Address it to spatial-users@iiug.org
2. Include the bracketed message number in the subject line: [10]

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *


 
Subscribe to this feed
You can subscribe to this RSS feed in a number of ways, including the following:

Drag the orange RSS button into your News Reader

Drag the URL of the RSS feed into your News Reader

Cut and paste the URL of the RSS feed into your News Reader