The phrase is explained by the rest of the passage. I'll give a larger excerpt of the paper for context.
A seminal paper by Selman, Mitchell and Levesque studied the empirical performance of DPLL-type solvers on uniform-random SAT instances. It found a strong correlation between an instance's hardness and its ratio of clauses to variables. Furthermore, it demonstrated that the hardest region (e.g., for random 3-SAT, a clauses-to-variables ratio of roughly 4.26) coincides with a phase transition in a theoretical property of the instance that is not tied to any algorithm: the probability that a randomly generated formula with a given ratio is satisfiable.
That is the non-algorithm-specific property: the probability that a randomly generated formula with a given clauses-to-variables ratio is satisfiable. The perceived hardness of a particular instance varies with the algorithm used to solve it, but the probability that a randomly generated SAT instance is satisfiable is the same no matter which solver you run. Yet the two properties are empirically linked.
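The experiment can be sketched in a few lines of Python: generate uniform-random 3-SAT formulas at a chosen clauses-to-variables ratio and estimate the fraction that are satisfiable. This is an illustration under my own assumptions (tiny instances, brute-force satisfiability checks), not the paper's actual experimental setup, and all function names are mine.

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Uniform-random 3-SAT: each clause picks 3 distinct variables
    and negates each independently with probability 1/2."""
    formula = []
    for _ in range(n_clauses):
        chosen = rng.sample(range(1, n_vars + 1), 3)
        formula.append([v if rng.random() < 0.5 else -v for v in chosen])
    return formula

def satisfiable(formula, n_vars):
    """Brute-force check; only viable for very small n_vars."""
    for bits in itertools.product([False, True], repeat=n_vars):
        # A positive literal v is true iff bits[v-1] is True.
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in formula):
            return True
    return False

def sat_probability(n_vars, ratio, trials=100, seed=0):
    """Estimate Pr[satisfiable] at a given clauses-to-variables ratio."""
    rng = random.Random(seed)
    n_clauses = round(ratio * n_vars)
    hits = sum(satisfiable(random_3sat(n_vars, n_clauses, rng), n_vars)
               for _ in range(trials))
    return hits / trials

# Well below the threshold almost every formula is satisfiable;
# well above it, almost none are.
print(sat_probability(10, 2.0))  # high
print(sat_probability(10, 6.0))  # low
```

Even at 10 variables the qualitative picture is visible: the estimated probability drops from near 1 to near 0 as the ratio crosses the transition region, though for such small instances the drop is smeared out around the asymptotic threshold.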
An example of an algorithm-specific property is the apparent hardness of pigeon-hole problems encoded as SAT instances. It has been proven that deciding the unsatisfiability of such instances requires exponentially long proofs in the resolution proof system, and hence exponentially long runtimes for solvers whose executions correspond to resolution proofs. However, a solver that adds a simple symmetry-breaking step (such as forcing a specific ordering of the pigeons in the holes) can decide unsatisfiability in polynomial time.
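For concreteness, the usual direct CNF encoding of "n+1 pigeons into n holes" is easy to write down: one clause per pigeon saying it occupies at least one hole, plus clauses forbidding any two pigeons from sharing a hole. The sketch below is my own illustration of that standard encoding; the variable numbering and function name are not from any particular benchmark suite.

```python
def pigeonhole_cnf(holes):
    """Direct CNF encoding of placing holes+1 pigeons into `holes` holes.
    The formula is unsatisfiable by the pigeon-hole principle.
    Literals are DIMACS-style integers: var(p, h) > 0 means
    'pigeon p sits in hole h'."""
    pigeons = holes + 1
    var = lambda p, h: p * holes + h + 1  # 1-based variable numbering
    clauses = []
    # Every pigeon is in at least one hole.
    for p in range(pigeons):
        clauses.append([var(p, h) for h in range(holes)])
    # No two pigeons share a hole.
    for h in range(holes):
        for p1 in range(pigeons):
            for p2 in range(p1 + 1, pigeons):
                clauses.append([-var(p1, h), -var(p2, h)])
    return clauses

cnf = pigeonhole_cnf(4)  # 5 pigeons, 4 holes
# (holes+1) "at least one hole" clauses, plus
# holes * C(holes+1, 2) pairwise-exclusion clauses.
print(len(cnf))
```

Note how symmetric the encoding is: permuting the pigeons (or the holes) maps the clause set onto itself, which is exactly the symmetry that resolution cannot exploit but a symmetry-breaking preprocessing step can.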