Data & Services - Solr Server
Configure Solr Server settings for ColdFusion search functionality
Overview
The Solr Server section in ColdFusion Administrator allows you to configure Apache Solr integration for full-text search capabilities. Solr provides powerful search features including faceting, hit highlighting, and advanced query parsing. ColdFusion includes a built-in Solr server, or you can configure it to connect to an external Solr instance for better scalability and performance.
Server Configuration
Configure the connection to your Solr server instance.
Solr Server Host
- Examples: localhost (built-in), solr.example.com (external), 10.0.1.50 (IP address)
- Production: use a dedicated Solr server or cluster
Port
- The port your Solr instance listens on (for example, 8983 as in the URL below)
Context Root
- /solr unless customized, giving a full URL such as http://localhost:8983/solr
Connection Timeout (seconds)
- Remote Solr: 90-120 seconds
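Once these values are saved, it can be useful to confirm from ColdFusion code that the configured endpoint is actually reachable. The sketch below is a minimal check, assuming the example URL shown above (http://localhost:8983/solr); adjust the host, port, and context root to match your own settings.

// Minimal reachability check for the configured Solr endpoint.
// The URL is an assumption based on the example above; adjust as needed.
solrBaseUrl = "http://localhost:8983/solr/";

cfhttp(url=solrBaseUrl, method="GET", timeout=30, result="solrResponse");

if (solrResponse.statusCode contains "200") {
    writeOutput("Solr is reachable at #solrBaseUrl#");
} else {
    writeOutput("Solr check failed: #solrResponse.statusCode#");
}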
Security Settings
Configure secure connections to your Solr server.
Use HTTPS
- Default: Disabled (HTTP)
- Recommendation: Always enable for production
Encrypts communication between ColdFusion and Solr. Essential when Solr is on a remote server to protect search queries and data.
Authentication
- Options: Basic Auth, PKI, Kerberos
- Recommendation: Enable authentication for external Solr
Restrict access to Solr admin and query interfaces. Configure in Solr's security.json file.
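As a quick end-to-end check that HTTPS and authentication are both working, a request like the following can be sent from ColdFusion. The hostname, port, and credentials are placeholders, and the sketch assumes Basic Auth has been enabled in security.json.

// Sketch: verify a secured (HTTPS + Basic Auth) Solr endpoint accepts authenticated requests.
// Host, port, and credentials are placeholders; use your own values.
cfhttp(
    url="https://solr.example.com:8983/solr/admin/info/system?wt=json",
    method="GET",
    username="solr_admin",
    password="changeit",
    timeout=30,
    result="secureCheck"
);

if (secureCheck.statusCode contains "200") {
    writeOutput("Authenticated HTTPS connection to Solr succeeded.");
} else {
    writeOutput("Connection was refused or rejected: #secureCheck.statusCode#");
}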
Working with Solr Collections
Use ColdFusion's built-in functions to interact with Solr collections.
Creating a Solr Collection
Create and configure a new Solr collection for search functionality:
// Create a new Solr collection (CFScript syntax for tags, ColdFusion 11+)
cfcollection(
    action="create",
    collection="products",
    path="/var/solr/data/products"
);

// Index files into the collection
cfindex(
    action="refresh",
    collection="products",
    key="/var/www/products",
    type="path",
    recurse=true,
    extensions=".pdf, .doc, .docx, .html"
);

// Search the collection
cfsearch(
    collection="products",
    criteria="laptop computer",
    maxRows=20,
    startRow=1,
    name="searchResults"
);

// Display results (cfsearch returns a query object)
for (result in searchResults) {
    writeOutput("<h3>#result.title#</h3>");
    writeOutput("<p>#result.summary#</p>");
    writeOutput("<p>Score: #result.score#</p>");
}

The same operations using CFML tags:

<!--- Create a new Solr collection --->
<cfcollection
action="create"
collection="products"
path="/var/solr/data/products">
<!--- Index files into the collection --->
<cfindex
action="refresh"
collection="products"
key="/var/www/products"
type="path"
recurse="true"
extensions="pdf,doc,docx,html">
<!--- Search the collection --->
<cfsearch
collection="products"
criteria="laptop computer"
maxrows="20"
startrow="1"
name="searchResults">
<!--- Display results --->
<cfloop query="searchResults">
<h3><cfoutput>#title#</cfoutput></h3>
<p><cfoutput>#summary#</cfoutput></p>
<p>Score: <cfoutput>#score#</cfoutput></p>
</cfloop>

Indexing Database Content
Index database records for full-text search:
// Get products from database (assumes a default datasource is configured)
products = queryExecute("
    SELECT id, name, description, category
    FROM products
    WHERE active = 1
");

// Index every row of the query; body, title, key, and custom1 name the
// query columns to map into the collection
cfindex(
    action="update",
    collection="products",
    query="products",
    body="description",
    title="name",
    key="id",
    custom1="category",
    type="custom"
);

// Search with filtering: custom fields are queried through the criteria string
cfsearch(
    collection="products",
    criteria="wireless mouse AND custom1:Electronics",
    name="searchResults"
);

The same operations using CFML tags:

<!--- Get products from database --->
<cfquery name="products">
SELECT id, name, description, category
FROM products
WHERE active = 1
</cfquery>
<!--- Index each product --->
<cfindex
action="update"
collection="products"
query="products"
body="description"
title="name"
key="id"
custom1="category"
type="custom">
<!--- Search with filtering: custom fields are queried through the criteria string --->
<cfsearch
collection="products"
criteria="wireless mouse AND custom1:Electronics"
name="searchResults">

Performance Optimization
Best practices for optimizing Solr performance.
Memory Allocation
- Set the heap in solr.in.sh (Linux/macOS) or solr.in.cmd (Windows), e.g. SOLR_HEAP="4g" for an 8GB system
- Small indexes (<10GB): 2-4GB heap
- Medium indexes (10-100GB): 8-16GB heap
- Large indexes (>100GB): 32GB+ heap
- Leave room for OS file system cache (improves query performance)
Commit Strategy
- Hard Commit: 30-60 seconds (makes changes durable)
- Soft Commit: 1-5 seconds (makes changes visible)
Balance between search freshness and indexing performance. Use soft commits for near-real-time search.
Cache Configuration
- Query Result Cache: cache frequent searches
- Filter Cache: cache common filters
Configure in solrconfig.xml. Monitor cache hit rates and adjust sizes accordingly.
Sharding & Replication
- Sharding: split large indexes across servers
- Replication: create read replicas for high availability
Use SolrCloud for distributed search and automatic failover in production.
Best Practices
- Run Solr on a separate server or container for production environments
- Use HTTPS for Solr connections when the server is not on localhost
- Enable authentication to prevent unauthorized access
- Configure firewall rules to restrict Solr port access
- Use SolrCloud for high availability and scalability
- Monitor Solr performance metrics (query time, cache hit rate, indexing rate)
- Regularly optimize indexes to improve query performance
- Backup Solr indexes and configuration regularly
- Monitor disk space usage (indexes can grow large)
- Review and analyze Solr query logs for optimization opportunities
- Set up alerting for Solr server availability and performance issues (a monitoring sketch follows this list)
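As one way to implement the availability monitoring and alerting suggested above, the sketch below pings a collection and raises an alert when the check fails. The collection name, URL, and email addresses are placeholder assumptions, and in practice this would run as a ColdFusion scheduled task using the mail server configured in the Administrator.

// Availability check for a Solr collection, intended for a scheduled task.
// Collection name, URL, and addresses below are placeholders.
pingUrl = "http://localhost:8983/solr/products/admin/ping?wt=json";

cfhttp(url=pingUrl, method="GET", timeout=15, result="pingResponse");

if (pingResponse.statusCode contains "200") {
    writeLog(file="solr-monitor", type="information", text="Solr ping OK");
} else {
    writeLog(file="solr-monitor", type="error", text="Solr ping failed: #pingResponse.statusCode#");
    // Alert operations; relies on the mail server configured in CF Administrator
    cfmail(to="ops@example.com", from="coldfusion@example.com", subject="Solr availability check failed") {
        writeOutput("Solr ping at #pingUrl# returned: #pingResponse.statusCode#");
    }
}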
Common Issues & Solutions
Unable to Connect to Solr Server
- Verify Solr is running: curl http://localhost:8983/solr/
- Check host, port, and context root settings in CF Administrator
- Verify firewall rules allow connection on Solr port
- Check Solr logs for startup errors
- Ensure correct protocol (HTTP vs HTTPS) is configured
Slow Search Performance
- Increase Solr JVM heap size for better performance
- Optimize Solr indexes periodically
- Enable query result caching in solrconfig.xml
- Use more specific search queries (avoid wildcards at beginning)
- Consider sharding large indexes across multiple servers
- Monitor and tune commit frequency
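When working through the list above, it helps to know which searches are actually slow. The sketch below times a cfsearch call and logs anything over a threshold; the 500 ms cutoff, log file name, and search criteria are illustrative assumptions.

// Time a search and log it when it exceeds a threshold so slow queries can be reviewed later.
// The threshold and log file name are arbitrary choices, not ColdFusion defaults.
slowThresholdMs = 500;
criteria = "laptop computer";
startTick = getTickCount();

cfsearch(
    collection="products",
    criteria=criteria,
    maxRows=20,
    name="searchResults"
);

elapsedMs = getTickCount() - startTick;

if (elapsedMs > slowThresholdMs) {
    writeLog(
        file="solr-slow-queries",
        type="warning",
        text="Slow Solr search (#elapsedMs# ms): #criteria#"
    );
}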
Out of Memory Errors
Symptoms: OutOfMemoryError, indexing or search operations fail
- Increase Solr JVM heap size in solr.in.sh or solr.in.cmd
- Reduce cache sizes in solrconfig.xml if too aggressive
- Index documents in smaller batches (see the sketch after this list)
- Add more RAM to Solr server if possible
- Enable garbage collection logging to identify memory issues
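For the batching suggestion above, one approach is to page through the source table and index each page separately, as in the sketch below. It assumes the "products" collection from the earlier examples, a default datasource, and a database that supports LIMIT/OFFSET.

// Index database rows in pages rather than in one large operation.
// Batch size, table, and column names mirror the earlier examples and are assumptions.
batchSize = 1000;
offset = 0;

do {
    batch = queryExecute("
        SELECT id, name, description, category
        FROM products
        WHERE active = 1
        ORDER BY id
        LIMIT :pageSize OFFSET :pageOffset
    ", {
        pageSize: { value: batchSize, cfsqltype: "cf_sql_integer" },
        pageOffset: { value: offset, cfsqltype: "cf_sql_integer" }
    });

    if (batch.recordCount) {
        // Index only the current page so Solr receives smaller update requests
        cfindex(
            action="update",
            collection="products",
            query="batch",
            key="id",
            title="name",
            body="description",
            custom1="category",
            type="custom"
        );
    }

    offset += batchSize;
} while (batch.recordCount == batchSize);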